Apr 24 21:13:48.088935 ip-10-0-139-15 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:13:48.088946 ip-10-0-139-15 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:13:48.088956 ip-10-0-139-15 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:13:48.089266 ip-10-0-139-15 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:13:58.232132 ip-10-0-139-15 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:13:58.232152 ip-10-0-139-15 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot dacb31be8aed49f1b338e2e5be46fa09 --
Apr 24 21:15:59.202710 ip-10-0-139-15 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:15:59.654122 ip-10-0-139-15 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:59.654122 ip-10-0-139-15 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:15:59.654122 ip-10-0-139-15 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:59.654122 ip-10-0-139-15 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:15:59.654122 ip-10-0-139-15 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:59.656282 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.656195 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:15:59.660170 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660155 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:59.660170 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660170 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660175 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660178 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660181 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660184 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660187 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660189 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660192 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660195 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660197 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660200 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660203 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660206 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660209 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660211 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660214 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660218 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660222 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660225 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:15:59.660238 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660228 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660231 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660234 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660237 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660240 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660243 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660246 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660248 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660252 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660256 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660259 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660262 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660264 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660267 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660270 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660272 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660275 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660278 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660280 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:15:59.660715 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660283 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660285 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660288 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660291 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660293 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660295 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660298 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660302 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660304 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660307 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660309 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660312 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660314 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660317 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660320 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660323 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660326 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660329 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660333 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660336 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:15:59.661168 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660338 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660341 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660344 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660346 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660349 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660351 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660354 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660356 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660359 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660361 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660364 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660367 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660369 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660372 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660374 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660377 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660381 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660383 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660386 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660388 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:59.661723 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660398 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660400 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660403 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660405 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660408 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660410 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660413 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660793 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660798 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660801 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660803 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660807 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660810 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660813 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660815 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660818 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660821 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660823 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660826 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660829 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:15:59.662250 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660831 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660834 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660836 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660839 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660842 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660844 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660846 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660849 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660852 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660854 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660856 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660859 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660862 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660865 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660867 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660870 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660872 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660875 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660877 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:15:59.662755 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660880 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660883 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660886 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660888 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660891 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660893 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660896 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660898 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660900 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660903 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660905 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660910 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660913 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660916 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660919 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660921 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660924 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660927 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660929 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:15:59.663222 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660932 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660934 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660936 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660939 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660942 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660945 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660947 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660950 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660953 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660956 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660959 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660962 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660965 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660968 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660971 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660973 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660976 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660978 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660981 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660983 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:15:59.663709 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660986 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660989 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660991 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660994 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660996 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.660999 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661002 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661004 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661007 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661009 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661011 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661014 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661016 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661019 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661021 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661092 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661099 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661107 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661111 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661115 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661119 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:15:59.664210 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661124 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661129 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661132 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661136 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661141 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661144 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661147 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661150 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661153 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661156 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661158 2581 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661161 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661164 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661168 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661171 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661174 2581 flags.go:64] FLAG: --config-dir=""
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661177 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661180 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661185 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661188 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661191 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661194 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661197 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661200 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:15:59.664736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661203 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661206 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661209 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661213 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661216 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661220 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661222 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661225 2581 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661229 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661233 2581 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661236 2581 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661239 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661243 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661246 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661250 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661253 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661256 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661259 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661262 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661264 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661268 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661271 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661274 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:15:59.665328 ip-10-0-139-15
kubenswrapper[2581]: I0424 21:15:59.661277 2581 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661279 2581 flags.go:64] FLAG: --feature-gates="" Apr 24 21:15:59.665328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661283 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661286 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661289 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661292 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661296 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661299 2581 flags.go:64] FLAG: --help="false" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661302 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-139-15.ec2.internal" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661305 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661308 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661311 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661314 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661317 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 
21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661320 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661323 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661326 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661329 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661332 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661335 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661338 2581 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661342 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661344 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661348 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661350 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661353 2581 flags.go:64] FLAG: --lock-file="" Apr 24 21:15:59.665945 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661356 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661359 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661362 2581 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661367 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661370 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661372 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661375 2581 flags.go:64] FLAG: --logging-format="text" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661378 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661381 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661384 2581 flags.go:64] FLAG: --manifest-url="" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661387 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661391 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661394 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661398 2581 flags.go:64] FLAG: --max-pods="110" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661401 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661404 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661407 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:15:59.666561 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:15:59.661410 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661413 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661415 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661418 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661440 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661443 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661446 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:15:59.666561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661449 2581 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661452 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661458 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661462 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661465 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661468 2581 flags.go:64] FLAG: --port="10250" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661471 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:15:59.667172 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661474 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-083d840ce750b7483" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661477 2581 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661481 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661484 2581 flags.go:64] FLAG: --register-node="true" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661486 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661489 2581 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661493 2581 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661496 2581 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661499 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661502 2581 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661505 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661508 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661511 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661514 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661517 2581 flags.go:64] FLAG: --runonce="false" Apr 24 21:15:59.667172 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:15:59.661520 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661523 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661526 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:15:59.667172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661529 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661531 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661534 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661537 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661541 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661544 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661546 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661553 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661556 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661560 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661563 2581 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661566 2581 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661571 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661574 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661577 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661581 2581 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661584 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661587 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661590 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661592 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661595 2581 flags.go:64] FLAG: --v="2" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661600 2581 flags.go:64] FLAG: --version="false" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661604 2581 flags.go:64] FLAG: --vmodule="" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661608 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.661611 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:15:59.667803 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661724 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:59.668542 ip-10-0-139-15 
kubenswrapper[2581]: W0424 21:15:59.661728 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661732 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661735 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661738 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661740 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661743 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661746 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661748 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661751 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661753 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661756 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661759 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661761 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 
21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661765 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661769 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661773 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661776 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661779 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661782 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:59.668542 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661785 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661787 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661790 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661793 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661795 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661798 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661801 2581 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661803 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661805 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661808 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661811 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661813 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661816 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661818 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661820 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661823 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661826 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661828 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661830 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: 
W0424 21:15:59.661833 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:59.669330 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661835 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661838 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661840 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661842 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661845 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661847 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661851 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661854 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661857 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661860 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661863 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661866 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 
21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661868 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661872 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661875 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661878 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661881 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661884 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661887 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:59.669909 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661890 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661892 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661895 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661898 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661901 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661903 
2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661906 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661908 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661911 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661913 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661916 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661918 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661921 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661923 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661926 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661928 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661931 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661934 2581 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661936 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:59.670464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661940 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661943 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661946 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661949 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661951 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661954 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661957 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.661959 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.662898 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.670491 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.670508 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670554 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670560 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670564 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670567 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670569 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:59.670967 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670572 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670575 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670578 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670581 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670583 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670586 2581 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670589 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670592 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670595 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670598 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670600 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670603 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670605 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670608 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670610 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670613 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670615 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670618 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 
21:15:59.670621 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670623 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:59.671369 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670626 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670629 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670631 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670634 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670637 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670640 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670643 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670646 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670648 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670651 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670654 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: 
W0424 21:15:59.670656 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670659 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670661 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670664 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670666 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670669 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670671 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670674 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670677 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:59.671896 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670679 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670682 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670685 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670687 2581 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670690 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670692 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670695 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670698 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670700 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670703 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670707 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670711 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670713 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670716 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670719 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670721 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670724 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670727 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670730 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:59.672388 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670732 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670735 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670738 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670740 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670743 2581 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670745 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670748 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670750 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670753 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670755 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670758 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670761 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670764 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670766 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670770 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670774 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670777 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670781 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670783 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:59.672877 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670786 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670788 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670791 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.670797 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670889 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670894 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 
24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670898 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670901 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670905 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670909 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670912 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670915 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670919 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670922 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670925 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670927 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:59.673352 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670930 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670933 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670935 2581 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670938 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670940 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670943 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670945 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670948 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670950 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670953 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670957 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670959 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670962 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670965 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670967 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 
21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670970 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670973 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670975 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670978 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670981 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:59.673844 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670984 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670987 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670990 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670993 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670996 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.670998 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671000 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671003 2581 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671006 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671009 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671011 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671014 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671017 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671019 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671022 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671024 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671026 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671029 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671032 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:59.674350 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671034 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 
21:15:59.671038 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671041 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671044 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671048 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671051 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671054 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671056 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671059 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671061 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671064 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671067 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671070 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671072 2581 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671076 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671078 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671081 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671084 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671086 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:59.674829 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671089 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671092 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671095 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671098 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671100 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671103 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671105 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 
21:15:59.671108 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671110 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671113 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671115 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671118 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671120 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671123 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671125 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:59.675287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:15:59.671128 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:59.675681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.671132 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:59.675681 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:15:59.671236 2581 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:15:59.675681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.673237 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:15:59.675681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.674761 2581 server.go:1019] "Starting client certificate rotation" Apr 24 21:15:59.675681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.674859 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:15:59.675681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.674901 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:15:59.701783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.701766 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:15:59.703543 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.703516 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:15:59.715342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.715325 2581 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:15:59.720907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.720891 2581 log.go:25] "Validated CRI v1 image API" Apr 24 21:15:59.723085 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.723066 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:15:59.727247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.727227 2581 fs.go:135] Filesystem UUIDs: map[131759a2-6264-496b-8fbc-e03cf756790b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c2f22b17-8f65-4a6b-9c99-ae28210e922a:/dev/nvme0n1p3] Apr 24 21:15:59.727319 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:15:59.727246 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:15:59.733729 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.733391 2581 manager.go:217] Machine: {Timestamp:2026-04-24 21:15:59.731017801 +0000 UTC m=+0.409854464 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093281 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28f1870081a1e2cc66116efa21e3e9 SystemUUID:ec28f187-0081-a1e2-cc66-116efa21e3e9 BootID:dacb31be-8aed-49f1-b338-e2e5be46fa09 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bd:60:4f:2a:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bd:60:4f:2a:55 Speed:0 
Mtu:9001} {Name:ovs-system MacAddress:62:ac:4e:09:c3:76 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:15:59.733729 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.733716 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 21:15:59.733876 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.733783 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:15:59.733876 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.733813 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:15:59.734863 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.734839 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:15:59.734991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.734866 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-15.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage
":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:15:59.735040 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.735000 2581 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:15:59.735040 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.735009 2581 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:15:59.735040 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.735021 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:15:59.736083 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.736071 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:15:59.737627 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.737617 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:15:59.737741 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.737732 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:15:59.740349 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.740340 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:15:59.740390 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.740352 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:15:59.740390 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.740364 2581 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 24 21:15:59.740390 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.740373 2581 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:15:59.740390 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.740380 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:15:59.741649 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.741638 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:15:59.741695 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.741656 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:15:59.745105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.745084 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:15:59.746589 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.746575 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:15:59.748538 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748526 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748543 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748549 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748555 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748561 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:15:59.748581 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748567 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748572 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748577 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:15:59.748581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748584 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:15:59.748787 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748590 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:15:59.748787 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748599 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:15:59.748787 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.748608 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:15:59.749575 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.749564 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:15:59.749575 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.749574 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:15:59.753191 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.753080 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:15:59.753251 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.753214 2581 server.go:1295] "Started kubelet" Apr 24 21:15:59.753336 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.753304 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:15:59.753410 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.753357 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" 
qps=100 burstTokens=10 Apr 24 21:15:59.753463 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.753439 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:15:59.754044 ip-10-0-139-15 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:15:59.754620 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.754268 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:15:59.754880 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.754832 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-15.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:15:59.754950 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.754912 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:15:59.755088 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.754998 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-15.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:15:59.755088 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.755061 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:15:59.761312 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.760221 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-139-15.ec2.internal.18a9678786bd99f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-15.ec2.internal,UID:ip-10-0-139-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-15.ec2.internal,},FirstTimestamp:2026-04-24 21:15:59.753189878 +0000 UTC m=+0.432026540,LastTimestamp:2026-04-24 21:15:59.753189878 +0000 UTC m=+0.432026540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-15.ec2.internal,}" Apr 24 21:15:59.761740 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.761723 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:15:59.761831 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.761741 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:15:59.762646 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762629 2581 factory.go:153] Registering CRI-O factory Apr 24 21:15:59.762722 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762654 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:15:59.762722 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762673 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:15:59.762722 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762703 2581 factory.go:223] Registration of the crio container factory successfully Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762725 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762751 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762771 2581 
factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762781 2581 factory.go:55] Registering systemd factory Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762789 2581 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762810 2581 factory.go:103] Registering Raw factory Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762636 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:15:59.762851 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.762823 2581 manager.go:1196] Started watching for new ooms in manager Apr 24 21:15:59.763182 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.762926 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found" Apr 24 21:15:59.763951 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.763935 2581 manager.go:319] Starting recovery of all containers Apr 24 21:15:59.767500 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.767377 2581 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:15:59.770550 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.770523 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-15.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:15:59.770747 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.770725 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:15:59.776229 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.776207 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r5v8r" Apr 24 21:15:59.776891 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.776875 2581 manager.go:324] Recovery completed Apr 24 21:15:59.780816 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.780803 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:59.783582 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.783568 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r5v8r" Apr 24 21:15:59.784192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.784175 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:59.784273 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.784202 2581 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:59.784273 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.784212 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:59.784674 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.784657 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:15:59.784674 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.784671 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:15:59.784787 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.784686 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:15:59.785985 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.785919 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-15.ec2.internal.18a9678788969fe5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-15.ec2.internal,UID:ip-10-0-139-15.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-15.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-15.ec2.internal,},FirstTimestamp:2026-04-24 21:15:59.784189925 +0000 UTC m=+0.463026588,LastTimestamp:2026-04-24 21:15:59.784189925 +0000 UTC m=+0.463026588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-15.ec2.internal,}" Apr 24 21:15:59.788029 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.788016 2581 policy_none.go:49] "None policy: Start" Apr 24 21:15:59.788081 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.788034 2581 
memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:15:59.788081 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.788044 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:15:59.826955 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.826939 2581 manager.go:341] "Starting Device Plugin manager" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.827009 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.827022 2581 server.go:85] "Starting device plugin registration server" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.827193 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.827203 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.827292 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.827369 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.827381 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.827793 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 21:15:59.839175 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.827827 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-15.ec2.internal\" not found" Apr 24 21:15:59.867401 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.867377 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:15:59.868558 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.868541 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:15:59.868639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.868566 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:15:59.868639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.868585 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:15:59.868639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.868591 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:15:59.868639 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.868621 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:15:59.871280 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.871262 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:59.927699 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.927652 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:59.928570 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.928556 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:59.928656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.928582 2581 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:59.928656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.928593 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:59.928656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.928613 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-15.ec2.internal" Apr 24 21:15:59.937719 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.937699 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-15.ec2.internal" Apr 24 21:15:59.937804 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.937723 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-15.ec2.internal\": node \"ip-10-0-139-15.ec2.internal\" not found" Apr 24 21:15:59.949980 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:15:59.949961 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found" Apr 24 21:15:59.968956 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.968934 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal"] Apr 24 21:15:59.969035 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.969018 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:59.969799 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.969786 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:59.969883 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.969817 2581 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-139-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:59.969883 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.969831 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:59.971143 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971129 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:59.971306 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971292 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" Apr 24 21:15:59.971353 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971321 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:59.971900 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971884 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:59.971980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971905 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:59.971980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971923 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:59.971980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971931 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:59.971980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971960 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:59.971980 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.971975 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:59.973296 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.973281 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" Apr 24 21:15:59.973362 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.973311 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:59.975089 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.975073 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:59.975156 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.975102 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:59.975156 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:15:59.975116 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:00.005981 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.005962 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-15.ec2.internal\" not found" node="ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.010299 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.010286 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-15.ec2.internal\" not found" node="ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.050899 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.050881 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found" Apr 
24 21:16:00.065589 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.065569 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/099fd95062106834f37953dba57d38ac-config\") pod \"kube-apiserver-proxy-ip-10-0-139-15.ec2.internal\" (UID: \"099fd95062106834f37953dba57d38ac\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.065649 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.065592 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b91d10f43033d8a8aebe3d36b0cb37a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal\" (UID: \"b91d10f43033d8a8aebe3d36b0cb37a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.065649 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.065613 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b91d10f43033d8a8aebe3d36b0cb37a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal\" (UID: \"b91d10f43033d8a8aebe3d36b0cb37a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.151731 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.151705 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found" Apr 24 21:16:00.166099 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.166064 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/099fd95062106834f37953dba57d38ac-config\") pod \"kube-apiserver-proxy-ip-10-0-139-15.ec2.internal\" (UID: \"099fd95062106834f37953dba57d38ac\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.166171 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.166114 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b91d10f43033d8a8aebe3d36b0cb37a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal\" (UID: \"b91d10f43033d8a8aebe3d36b0cb37a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.166171 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.166139 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b91d10f43033d8a8aebe3d36b0cb37a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal\" (UID: \"b91d10f43033d8a8aebe3d36b0cb37a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.166232 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.166166 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/099fd95062106834f37953dba57d38ac-config\") pod \"kube-apiserver-proxy-ip-10-0-139-15.ec2.internal\" (UID: \"099fd95062106834f37953dba57d38ac\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.166232 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.166192 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b91d10f43033d8a8aebe3d36b0cb37a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal\" (UID: \"b91d10f43033d8a8aebe3d36b0cb37a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" Apr 24 21:16:00.166310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.166251 2581 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b91d10f43033d8a8aebe3d36b0cb37a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal\" (UID: \"b91d10f43033d8a8aebe3d36b0cb37a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal"
Apr 24 21:16:00.252504 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.252450 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.308004 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.307975 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal"
Apr 24 21:16:00.312608 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.312592 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal"
Apr 24 21:16:00.353087 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.353068 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.453648 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.453615 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.554207 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.554150 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.654718 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.654688 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.674192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.674171 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:16:00.674332 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.674314 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:00.755762 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.755730 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.762342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.762325 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:00.786097 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.786055 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:10:59 +0000 UTC" deadline="2028-01-11 00:00:52.814619316 +0000 UTC"
Apr 24 21:16:00.786097 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.786093 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15026h44m52.028530615s"
Apr 24 21:16:00.786736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.786721 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:00.816397 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.816349 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6zw8j"
Apr 24 21:16:00.825965 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.825950 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6zw8j"
Apr 24 21:16:00.856097 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.856076 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:00.945264 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:00.945230 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod099fd95062106834f37953dba57d38ac.slice/crio-19799229def1e0e665939b70554d66ec6b6115f55d67fd71d78a4d593699c3a8 WatchSource:0}: Error finding container 19799229def1e0e665939b70554d66ec6b6115f55d67fd71d78a4d593699c3a8: Status 404 returned error can't find the container with id 19799229def1e0e665939b70554d66ec6b6115f55d67fd71d78a4d593699c3a8
Apr 24 21:16:00.946019 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:00.945998 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb91d10f43033d8a8aebe3d36b0cb37a8.slice/crio-ceafea538d052fa2b3a50788ed456b980cd84577a1348951d1d935f683d3e443 WatchSource:0}: Error finding container ceafea538d052fa2b3a50788ed456b980cd84577a1348951d1d935f683d3e443: Status 404 returned error can't find the container with id ceafea538d052fa2b3a50788ed456b980cd84577a1348951d1d935f683d3e443
Apr 24 21:16:00.950984 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:00.950970 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:16:00.956930 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:00.956914 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:01.011985 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.011963 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:01.033890 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.033865 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:01.057646 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.057623 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-15.ec2.internal\" not found"
Apr 24 21:16:01.149495 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.149400 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:01.162236 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.162212 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal"
Apr 24 21:16:01.175711 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.175692 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:01.176641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.176630 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal"
Apr 24 21:16:01.184478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.184465 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:01.613668 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.613389 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:01.742077 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.742050 2581 apiserver.go:52] "Watching apiserver"
Apr 24 21:16:01.749214 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.749191 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:16:01.751202 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.751171 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-kgjxk","kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal","openshift-dns/node-resolver-5pvk7","openshift-network-diagnostics/network-check-target-5shjj","openshift-network-operator/iptables-alerter-4ln8q","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg","openshift-cluster-node-tuning-operator/tuned-bl44b","openshift-image-registry/node-ca-nwpfl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal","openshift-multus/multus-9rv78","openshift-multus/multus-additional-cni-plugins-5shvq","openshift-multus/network-metrics-daemon-n487x","openshift-ovn-kubernetes/ovnkube-node-qnlsv"]
Apr 24 21:16:01.753757 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.753734 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.755897 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.755869 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.756355 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.756337 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.756446 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.756379 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.756536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.756513 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s2wnh\""
Apr 24 21:16:01.757666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.757643 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.758304 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.757958 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:16:01.758304 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.758139 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.758523 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.758503 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tz8dc\""
Apr 24 21:16:01.760329 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.760308 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:01.763214 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.763191 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:01.763316 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.763260 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:01.763596 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.763578 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k77jb\""
Apr 24 21:16:01.763672 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.763625 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:16:01.763813 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.763794 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:16:01.765655 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.765635 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.765797 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.765726 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.767489 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.767472 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768036 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768098 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pml99\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768285 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n99pp\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768310 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768481 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768502 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:16:01.768691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.768683 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.770735 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.770349 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5pvk7"
Apr 24 21:16:01.772373 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.772346 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.772545 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.772529 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.772608 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.772580 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9hfvp\""
Apr 24 21:16:01.772673 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.772658 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.774225 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774132 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:01.774225 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774168 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-socket-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.774225 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774196 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-device-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.774225 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774222 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42h2v\" (UniqueName: \"kubernetes.io/projected/d7dffc39-699b-4872-afc1-cfff9e51ea9d-kube-api-access-42h2v\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774254 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-kubernetes\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774279 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aff945d8-1222-4254-8b87-2cd6e5517284-agent-certs\") pod \"konnectivity-agent-kgjxk\" (UID: \"aff945d8-1222-4254-8b87-2cd6e5517284\") " pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774301 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-registration-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774325 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-systemd\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3d5446f-bdf9-4989-9626-7608937edfa8-host-slash\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774368 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-sys\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.774519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774391 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltt2\" (UniqueName: \"kubernetes.io/projected/801b2a6e-b16d-4e63-b007-af7d6c9273f5-kube-api-access-jltt2\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.774842 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774413 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-etc-selinux\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.774842 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774715 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-sys-fs\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.774842 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774757 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-lib-modules\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.774842 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774803 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5073b991-08af-46af-8888-2f0923d2dfac-etc-tuned\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774849 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774855 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vldl6\""
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774838 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801b2a6e-b16d-4e63-b007-af7d6c9273f5-host\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774884 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774901 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3d5446f-bdf9-4989-9626-7608937edfa8-iptables-alerter-script\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774946 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysconfig\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.774975 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysctl-d\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.775028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775011 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-run\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.775363 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775046 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aff945d8-1222-4254-8b87-2cd6e5517284-konnectivity-ca\") pod \"konnectivity-agent-kgjxk\" (UID: \"aff945d8-1222-4254-8b87-2cd6e5517284\") " pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:01.775735 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775709 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.775818 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775772 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-var-lib-kubelet\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.775867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775813 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5jl9\" (UniqueName: \"kubernetes.io/projected/5073b991-08af-46af-8888-2f0923d2dfac-kube-api-access-t5jl9\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.775867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775847 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801b2a6e-b16d-4e63-b007-af7d6c9273f5-serviceca\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.775960 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775881 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ljjz\" (UniqueName: \"kubernetes.io/projected/a3d5446f-bdf9-4989-9626-7608937edfa8-kube-api-access-2ljjz\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.775960 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.775913 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-modprobe-d\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.776070 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.776049 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:16:01.776335 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.776315 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.776583 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.776560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysctl-conf\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.776667 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.776610 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-host\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.776760 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.776661 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5073b991-08af-46af-8888-2f0923d2dfac-tmp\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.776825 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.776750 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:01.777247 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.777215 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:01.777394 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.777335 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.780469 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.780403 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:16:01.780626 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.780611 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:16:01.780739 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.780719 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.780837 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.780818 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tb85r\""
Apr 24 21:16:01.783461 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783282 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:16:01.783461 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783388 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:16:01.783658 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783622 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:16:01.783727 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783708 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:16:01.783859 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783834 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:16:01.783951 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783913 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:16:01.783951 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.783366 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hhs9n\""
Apr 24 21:16:01.826721 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.826689 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:00 +0000 UTC" deadline="2027-12-05 08:00:30.218913376 +0000 UTC"
Apr 24 21:16:01.826721 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.826721 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14146h44m28.392196334s"
Apr 24 21:16:01.864315 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.864237 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:16:01.872938 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.872893 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" event={"ID":"099fd95062106834f37953dba57d38ac","Type":"ContainerStarted","Data":"19799229def1e0e665939b70554d66ec6b6115f55d67fd71d78a4d593699c3a8"}
Apr 24 21:16:01.874193 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.874163 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" event={"ID":"b91d10f43033d8a8aebe3d36b0cb37a8","Type":"ContainerStarted","Data":"ceafea538d052fa2b3a50788ed456b980cd84577a1348951d1d935f683d3e443"}
Apr 24 21:16:01.877433 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877392 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3d5446f-bdf9-4989-9626-7608937edfa8-iptables-alerter-script\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.877540 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877447 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysctl-d\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.877540 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877493 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-run\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.877540 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877522 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-cni-bin\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.877697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877594 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-var-lib-kubelet\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.877697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877617 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-run\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.877697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877623 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801b2a6e-b16d-4e63-b007-af7d6c9273f5-serviceca\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.877697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877675 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysctl-d\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877680 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-cni-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877732 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-hostroot\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877757 2581
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-conf-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-var-lib-kubelet\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877781 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-run-ovn-kubernetes\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-env-overrides\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877833 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysctl-conf\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.877889 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877856 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-host\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.877889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877878 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5073b991-08af-46af-8888-2f0923d2dfac-tmp\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877903 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovn-node-metrics-cert\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877932 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877958 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-systemd-units\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877981 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-socket-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.877990 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-host\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878004 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-os-release\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878030 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aff945d8-1222-4254-8b87-2cd6e5517284-agent-certs\") pod \"konnectivity-agent-kgjxk\" (UID: \"aff945d8-1222-4254-8b87-2cd6e5517284\") " pod="kube-system/konnectivity-agent-kgjxk" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878056 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-registration-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: 
\"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878062 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/801b2a6e-b16d-4e63-b007-af7d6c9273f5-serviceca\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878080 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-systemd\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878092 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3d5446f-bdf9-4989-9626-7608937edfa8-iptables-alerter-script\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878109 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-system-cni-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878150 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-cni-multus\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-system-cni-dir\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysctl-conf\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878229 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-etc-kubernetes\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878264 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878261 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-sys\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878284 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-cnibin\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878301 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-systemd\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878307 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-os-release\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878343 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878379 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-kubelet\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878410 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878464 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zpvp\" (UniqueName: \"kubernetes.io/projected/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-kube-api-access-6zpvp\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878467 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-sys\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878469 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-socket-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878498 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-registration-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878527 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-sys-fs\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801b2a6e-b16d-4e63-b007-af7d6c9273f5-host\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878569 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-sys-fs\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878588 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-socket-dir-parent\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878610 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/801b2a6e-b16d-4e63-b007-af7d6c9273f5-host\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878613 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-multus-certs\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.878989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878654 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gml6h\" (UniqueName: \"kubernetes.io/projected/ff16a19a-c677-4d51-81d3-8d67d7ce1749-kube-api-access-gml6h\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878685 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-cnibin\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:16:01.878707 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-node-log\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878732 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysconfig\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878754 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-tmp-dir\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878776 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtrc\" (UniqueName: \"kubernetes.io/projected/c61fee18-e272-4bf5-aa08-65392bba68b6-kube-api-access-mwtrc\") pod \"multus-additional-cni-plugins-5shvq\" (UID: 
\"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878836 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-sysconfig\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878890 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aff945d8-1222-4254-8b87-2cd6e5517284-konnectivity-ca\") pod \"konnectivity-agent-kgjxk\" (UID: \"aff945d8-1222-4254-8b87-2cd6e5517284\") " pod="kube-system/konnectivity-agent-kgjxk" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.878969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879105 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879151 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5jl9\" (UniqueName: 
\"kubernetes.io/projected/5073b991-08af-46af-8888-2f0923d2dfac-kube-api-access-t5jl9\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879188 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-systemd\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-etc-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879240 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.879772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879266 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovnkube-config\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.879772 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:16:01.879296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ljjz\" (UniqueName: \"kubernetes.io/projected/a3d5446f-bdf9-4989-9626-7608937edfa8-kube-api-access-2ljjz\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q" Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879324 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-modprobe-d\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-hosts-file\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879410 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aff945d8-1222-4254-8b87-2cd6e5517284-konnectivity-ca\") pod \"konnectivity-agent-kgjxk\" (UID: \"aff945d8-1222-4254-8b87-2cd6e5517284\") " pod="kube-system/konnectivity-agent-kgjxk" Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879474 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-modprobe-d\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 
21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879651    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff16a19a-c677-4d51-81d3-8d67d7ce1749-cni-binary-copy\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879679    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879717    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879768    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld9nn\" (UniqueName: \"kubernetes.io/projected/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-kube-api-access-ld9nn\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879809    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-daemon-config\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879834    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-log-socket\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879867    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-device-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879892    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42h2v\" (UniqueName: \"kubernetes.io/projected/d7dffc39-699b-4872-afc1-cfff9e51ea9d-kube-api-access-42h2v\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.879966    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-device-dir\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880036    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-kubernetes\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880067    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-kubelet\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880091    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-slash\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.880494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880105    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-etc-kubernetes\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880136    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-run-netns\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880160    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-cni-netd\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880185    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6mwm\" (UniqueName: \"kubernetes.io/projected/657a2c9b-4e75-4d61-bff2-d8abdd05825d-kube-api-access-t6mwm\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880217    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880243    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-ovn\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880265    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-cni-bin\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880290    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880315    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3d5446f-bdf9-4989-9626-7608937edfa8-host-slash\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880353    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jltt2\" (UniqueName: \"kubernetes.io/projected/801b2a6e-b16d-4e63-b007-af7d6c9273f5-kube-api-access-jltt2\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880389    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-k8s-cni-cncf-io\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880418    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-netns\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880442    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3d5446f-bdf9-4989-9626-7608937edfa8-host-slash\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880462    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-var-lib-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880489    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-etc-selinux\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880516    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-lib-modules\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880541    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5073b991-08af-46af-8888-2f0923d2dfac-etc-tuned\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.881774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880567    2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovnkube-script-lib\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.882624 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880630    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7dffc39-699b-4872-afc1-cfff9e51ea9d-etc-selinux\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.882624 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.880693    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5073b991-08af-46af-8888-2f0923d2dfac-lib-modules\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.882624 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.881597    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aff945d8-1222-4254-8b87-2cd6e5517284-agent-certs\") pod \"konnectivity-agent-kgjxk\" (UID: \"aff945d8-1222-4254-8b87-2cd6e5517284\") " pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:01.882624 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.881776    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5073b991-08af-46af-8888-2f0923d2dfac-tmp\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.884092 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.884072    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5073b991-08af-46af-8888-2f0923d2dfac-etc-tuned\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.886631 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.886611    2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:01.886631 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.886631    2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:01.886771 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.886645    2581 projected.go:194] Error preparing data for projected volume kube-api-access-hb75q for pod openshift-network-diagnostics/network-check-target-5shjj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:01.886771 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.886735    2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q podName:9018a4db-1967-45ae-8ad3-7fdd04d6a4d1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:02.386700047 +0000 UTC m=+3.065536699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hb75q" (UniqueName: "kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q") pod "network-check-target-5shjj" (UID: "9018a4db-1967-45ae-8ad3-7fdd04d6a4d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:01.891886 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.891822    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5jl9\" (UniqueName: \"kubernetes.io/projected/5073b991-08af-46af-8888-2f0923d2dfac-kube-api-access-t5jl9\") pod \"tuned-bl44b\" (UID: \"5073b991-08af-46af-8888-2f0923d2dfac\") " pod="openshift-cluster-node-tuning-operator/tuned-bl44b"
Apr 24 21:16:01.892610 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.892583    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jltt2\" (UniqueName: \"kubernetes.io/projected/801b2a6e-b16d-4e63-b007-af7d6c9273f5-kube-api-access-jltt2\") pod \"node-ca-nwpfl\" (UID: \"801b2a6e-b16d-4e63-b007-af7d6c9273f5\") " pod="openshift-image-registry/node-ca-nwpfl"
Apr 24 21:16:01.892853 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.892833    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ljjz\" (UniqueName: \"kubernetes.io/projected/a3d5446f-bdf9-4989-9626-7608937edfa8-kube-api-access-2ljjz\") pod \"iptables-alerter-4ln8q\" (UID: \"a3d5446f-bdf9-4989-9626-7608937edfa8\") " pod="openshift-network-operator/iptables-alerter-4ln8q"
Apr 24 21:16:01.892921 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.892866    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42h2v\" (UniqueName: \"kubernetes.io/projected/d7dffc39-699b-4872-afc1-cfff9e51ea9d-kube-api-access-42h2v\") pod \"aws-ebs-csi-driver-node-gxrpg\" (UID: \"d7dffc39-699b-4872-afc1-cfff9e51ea9d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg"
Apr 24 21:16:01.981527 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981495    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-kubelet\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981527 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981526    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-slash\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981549    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-run-netns\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981598    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-run-netns\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981615    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-slash\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981615    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-kubelet\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981644    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-cni-netd\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981678    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6mwm\" (UniqueName: \"kubernetes.io/projected/657a2c9b-4e75-4d61-bff2-d8abdd05825d-kube-api-access-t6mwm\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981688    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-cni-netd\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981705    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.981753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981739    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-ovn\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981763    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-cni-bin\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981785    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981813    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-k8s-cni-cncf-io\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981825    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-ovn\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981859    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981835    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-netns\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981878    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-netns\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981867    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-k8s-cni-cncf-io\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981898    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-var-lib-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981904    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-cni-bin\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981926    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovnkube-script-lib\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.981957    2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981970    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-var-lib-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.981979    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-cni-bin\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982038    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-cni-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:01.982066    2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:02.48203617 +0000 UTC m=+3.160872847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:01.982175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982093    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-cni-bin\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982099    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-cni-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982095    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-hostroot\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982133    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-hostroot\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982139    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-conf-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982168    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-run-ovn-kubernetes\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982193    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-env-overrides\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982220    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovn-node-metrics-cert\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982253    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-run-ovn-kubernetes\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982268    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-systemd-units\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982293    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-conf-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982310    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-os-release\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982314    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-systemd-units\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982335    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-system-cni-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982351    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-cni-multus\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982368    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-system-cni-dir\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982388    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-etc-kubernetes\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982405    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-cnibin\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.982908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982419    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-os-release\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982465    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-kubelet\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982484    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982498    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zpvp\" (UniqueName: \"kubernetes.io/projected/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-kube-api-access-6zpvp\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982520    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-socket-dir-parent\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982541    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-multus-certs\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982556    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gml6h\" (UniqueName: \"kubernetes.io/projected/ff16a19a-c677-4d51-81d3-8d67d7ce1749-kube-api-access-gml6h\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982572    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-cnibin\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982599    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovnkube-script-lib\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982620    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-node-log\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982664    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-node-log\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982665    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-os-release\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982680    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-os-release\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982693    2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-tmp-dir\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7"
Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982710    2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName:
\"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-env-overrides\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982733 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-cnibin\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982736 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-kubelet\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982739 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-var-lib-cni-multus\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.983688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-system-cni-dir\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982779 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-system-cni-dir\") pod 
\"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-etc-kubernetes\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982809 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-socket-dir-parent\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982821 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff16a19a-c677-4d51-81d3-8d67d7ce1749-host-run-multus-certs\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982826 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c61fee18-e272-4bf5-aa08-65392bba68b6-cnibin\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982842 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982869 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtrc\" (UniqueName: \"kubernetes.io/projected/c61fee18-e272-4bf5-aa08-65392bba68b6-kube-api-access-mwtrc\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-systemd\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982949 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-etc-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.982976 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983002 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovnkube-config\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983036 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-hosts-file\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983061 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff16a19a-c677-4d51-81d3-8d67d7ce1749-cni-binary-copy\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983085 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983109 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983122 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-tmp-dir\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.984517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld9nn\" (UniqueName: \"kubernetes.io/projected/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-kube-api-access-ld9nn\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983168 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-daemon-config\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983195 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-log-socket\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983217 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-etc-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983260 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-log-socket\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983278 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983361 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983389 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-openvswitch\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983417 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-hosts-file\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983449 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-run-systemd\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983697 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovnkube-config\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983784 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff16a19a-c677-4d51-81d3-8d67d7ce1749-multus-daemon-config\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983794 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983861 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff16a19a-c677-4d51-81d3-8d67d7ce1749-cni-binary-copy\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.983881 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c61fee18-e272-4bf5-aa08-65392bba68b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:01.985256 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.985208 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-ovn-node-metrics-cert\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:01.990972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.990945 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6mwm\" (UniqueName: \"kubernetes.io/projected/657a2c9b-4e75-4d61-bff2-d8abdd05825d-kube-api-access-t6mwm\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:01.991873 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.991850 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld9nn\" (UniqueName: \"kubernetes.io/projected/92521ad1-7ba9-4bdd-bc3b-f470cd17cfef-kube-api-access-ld9nn\") pod \"node-resolver-5pvk7\" (UID: \"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef\") " pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:01.992152 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.992133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtrc\" (UniqueName: \"kubernetes.io/projected/c61fee18-e272-4bf5-aa08-65392bba68b6-kube-api-access-mwtrc\") pod \"multus-additional-cni-plugins-5shvq\" (UID: \"c61fee18-e272-4bf5-aa08-65392bba68b6\") " pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 
24 21:16:01.992232 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.992211 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gml6h\" (UniqueName: \"kubernetes.io/projected/ff16a19a-c677-4d51-81d3-8d67d7ce1749-kube-api-access-gml6h\") pod \"multus-9rv78\" (UID: \"ff16a19a-c677-4d51-81d3-8d67d7ce1749\") " pod="openshift-multus/multus-9rv78" Apr 24 21:16:01.992275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:01.992234 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zpvp\" (UniqueName: \"kubernetes.io/projected/56d7cab8-8a8d-47a6-81da-f1f67f4aed59-kube-api-access-6zpvp\") pod \"ovnkube-node-qnlsv\" (UID: \"56d7cab8-8a8d-47a6-81da-f1f67f4aed59\") " pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:02.066335 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.066292 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bl44b" Apr 24 21:16:02.073764 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.073741 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwpfl" Apr 24 21:16:02.084326 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.084301 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kgjxk" Apr 24 21:16:02.092877 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.092860 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln8q" Apr 24 21:16:02.098410 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.098393 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" Apr 24 21:16:02.105960 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.105940 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5pvk7" Apr 24 21:16:02.114521 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.114455 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9rv78" Apr 24 21:16:02.120955 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.120937 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5shvq" Apr 24 21:16:02.127540 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.127522 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:16:02.486885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.486803 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:02.486885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.486855 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:02.487084 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.486979 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:02.487084 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.487042 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs 
podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:03.487022817 +0000 UTC m=+4.165859470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:02.487189 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.487090 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:02.487189 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.487110 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:02.487189 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.487123 2581 projected.go:194] Error preparing data for projected volume kube-api-access-hb75q for pod openshift-network-diagnostics/network-check-target-5shjj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:02.487189 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.487171 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q podName:9018a4db-1967-45ae-8ad3-7fdd04d6a4d1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:03.487154964 +0000 UTC m=+4.165991621 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hb75q" (UniqueName: "kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q") pod "network-check-target-5shjj" (UID: "9018a4db-1967-45ae-8ad3-7fdd04d6a4d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:02.695616 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.695582 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3d5446f_bdf9_4989_9626_7608937edfa8.slice/crio-bbba660776bc26b3dce72e96296f0e7495f4ffcd2bdcd0f3894ba3975d2e42e5 WatchSource:0}: Error finding container bbba660776bc26b3dce72e96296f0e7495f4ffcd2bdcd0f3894ba3975d2e42e5: Status 404 returned error can't find the container with id bbba660776bc26b3dce72e96296f0e7495f4ffcd2bdcd0f3894ba3975d2e42e5 Apr 24 21:16:02.697824 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.697676 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92521ad1_7ba9_4bdd_bc3b_f470cd17cfef.slice/crio-d83eb7b4d00ee15e2dc7112dc0aefc1df74a17c5cb63912af881225a6684507a WatchSource:0}: Error finding container d83eb7b4d00ee15e2dc7112dc0aefc1df74a17c5cb63912af881225a6684507a: Status 404 returned error can't find the container with id d83eb7b4d00ee15e2dc7112dc0aefc1df74a17c5cb63912af881225a6684507a Apr 24 21:16:02.700831 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.700800 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801b2a6e_b16d_4e63_b007_af7d6c9273f5.slice/crio-be2cbf27d55b90aefc368393f30229d69a81cb176d6b53889607515e31f37919 WatchSource:0}: Error finding container be2cbf27d55b90aefc368393f30229d69a81cb176d6b53889607515e31f37919: Status 404 returned error can't find the 
container with id be2cbf27d55b90aefc368393f30229d69a81cb176d6b53889607515e31f37919 Apr 24 21:16:02.701509 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.701482 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff16a19a_c677_4d51_81d3_8d67d7ce1749.slice/crio-c759ffa468c68d80589a71267fdfe760ba9dd3fd3995b3142b7fc51447009a7b WatchSource:0}: Error finding container c759ffa468c68d80589a71267fdfe760ba9dd3fd3995b3142b7fc51447009a7b: Status 404 returned error can't find the container with id c759ffa468c68d80589a71267fdfe760ba9dd3fd3995b3142b7fc51447009a7b Apr 24 21:16:02.703287 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.703261 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56d7cab8_8a8d_47a6_81da_f1f67f4aed59.slice/crio-fe27b48ea894a19fc652e4f7b9d1a5789cc3ceb05156c750d2d6551827f507a5 WatchSource:0}: Error finding container fe27b48ea894a19fc652e4f7b9d1a5789cc3ceb05156c750d2d6551827f507a5: Status 404 returned error can't find the container with id fe27b48ea894a19fc652e4f7b9d1a5789cc3ceb05156c750d2d6551827f507a5 Apr 24 21:16:02.703924 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.703897 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7dffc39_699b_4872_afc1_cfff9e51ea9d.slice/crio-2687b182bd9ea368061f338cd12f35c9342304d374b7a8eb621697747ca45266 WatchSource:0}: Error finding container 2687b182bd9ea368061f338cd12f35c9342304d374b7a8eb621697747ca45266: Status 404 returned error can't find the container with id 2687b182bd9ea368061f338cd12f35c9342304d374b7a8eb621697747ca45266 Apr 24 21:16:02.704696 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.704672 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5073b991_08af_46af_8888_2f0923d2dfac.slice/crio-0b59793eb7e48c626251f624db7f3bee827267ab0c9516af8224f2051a366b5c WatchSource:0}: Error finding container 0b59793eb7e48c626251f624db7f3bee827267ab0c9516af8224f2051a366b5c: Status 404 returned error can't find the container with id 0b59793eb7e48c626251f624db7f3bee827267ab0c9516af8224f2051a366b5c Apr 24 21:16:02.706072 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.705639 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc61fee18_e272_4bf5_aa08_65392bba68b6.slice/crio-f1d5a28fa58a1d5ffee68fceb5093922e08e9692ffdc8db91bff55bb1efc4d91 WatchSource:0}: Error finding container f1d5a28fa58a1d5ffee68fceb5093922e08e9692ffdc8db91bff55bb1efc4d91: Status 404 returned error can't find the container with id f1d5a28fa58a1d5ffee68fceb5093922e08e9692ffdc8db91bff55bb1efc4d91 Apr 24 21:16:02.706737 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:02.706663 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaff945d8_1222_4254_8b87_2cd6e5517284.slice/crio-8d2bb2a98d64816b4fff2003aa523910681e38cce6e28cfc35650a8542816924 WatchSource:0}: Error finding container 8d2bb2a98d64816b4fff2003aa523910681e38cce6e28cfc35650a8542816924: Status 404 returned error can't find the container with id 8d2bb2a98d64816b4fff2003aa523910681e38cce6e28cfc35650a8542816924 Apr 24 21:16:02.826847 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.826815 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:00 +0000 UTC" deadline="2027-09-29 14:08:14.853305911 +0000 UTC" Apr 24 21:16:02.826847 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.826844 2581 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12544h52m12.026464302s" Apr 24 21:16:02.869198 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.869178 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:02.869310 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:02.869269 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:02.876960 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.876936 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kgjxk" event={"ID":"aff945d8-1222-4254-8b87-2cd6e5517284","Type":"ContainerStarted","Data":"8d2bb2a98d64816b4fff2003aa523910681e38cce6e28cfc35650a8542816924"} Apr 24 21:16:02.877914 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.877892 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerStarted","Data":"f1d5a28fa58a1d5ffee68fceb5093922e08e9692ffdc8db91bff55bb1efc4d91"} Apr 24 21:16:02.878878 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.878856 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" event={"ID":"d7dffc39-699b-4872-afc1-cfff9e51ea9d","Type":"ContainerStarted","Data":"2687b182bd9ea368061f338cd12f35c9342304d374b7a8eb621697747ca45266"} Apr 24 21:16:02.879989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.879968 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" 
event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"fe27b48ea894a19fc652e4f7b9d1a5789cc3ceb05156c750d2d6551827f507a5"} Apr 24 21:16:02.880893 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.880875 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5pvk7" event={"ID":"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef","Type":"ContainerStarted","Data":"d83eb7b4d00ee15e2dc7112dc0aefc1df74a17c5cb63912af881225a6684507a"} Apr 24 21:16:02.882235 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.882199 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" event={"ID":"099fd95062106834f37953dba57d38ac","Type":"ContainerStarted","Data":"ff706b742e8a4f40c3bb7189ab98d9e28a58ba6fa97e27f38a5b310ae036bc62"} Apr 24 21:16:02.883211 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.883191 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bl44b" event={"ID":"5073b991-08af-46af-8888-2f0923d2dfac","Type":"ContainerStarted","Data":"0b59793eb7e48c626251f624db7f3bee827267ab0c9516af8224f2051a366b5c"} Apr 24 21:16:02.884085 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.884064 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9rv78" event={"ID":"ff16a19a-c677-4d51-81d3-8d67d7ce1749","Type":"ContainerStarted","Data":"c759ffa468c68d80589a71267fdfe760ba9dd3fd3995b3142b7fc51447009a7b"} Apr 24 21:16:02.884952 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.884935 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwpfl" event={"ID":"801b2a6e-b16d-4e63-b007-af7d6c9273f5","Type":"ContainerStarted","Data":"be2cbf27d55b90aefc368393f30229d69a81cb176d6b53889607515e31f37919"} Apr 24 21:16:02.885812 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.885783 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln8q" event={"ID":"a3d5446f-bdf9-4989-9626-7608937edfa8","Type":"ContainerStarted","Data":"bbba660776bc26b3dce72e96296f0e7495f4ffcd2bdcd0f3894ba3975d2e42e5"} Apr 24 21:16:02.898075 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:02.896719 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-15.ec2.internal" podStartSLOduration=1.896704029 podStartE2EDuration="1.896704029s" podCreationTimestamp="2026-04-24 21:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:02.895998966 +0000 UTC m=+3.574835638" watchObservedRunningTime="2026-04-24 21:16:02.896704029 +0000 UTC m=+3.575540704" Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:03.494745 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:03.494816 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.494971 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.494989 2581 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.495003 2581 projected.go:194] Error preparing data for projected volume kube-api-access-hb75q for pod openshift-network-diagnostics/network-check-target-5shjj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.495059 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q podName:9018a4db-1967-45ae-8ad3-7fdd04d6a4d1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:05.495041725 +0000 UTC m=+6.173878381 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hb75q" (UniqueName: "kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q") pod "network-check-target-5shjj" (UID: "9018a4db-1967-45ae-8ad3-7fdd04d6a4d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.495491 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:03.495578 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.495541 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:05.49552576 +0000 UTC m=+6.174362416 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:03.869727 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:03.869198 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:03.869727 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:03.869338 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:03.917499 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:03.917466 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" event={"ID":"b91d10f43033d8a8aebe3d36b0cb37a8","Type":"ContainerStarted","Data":"98f9dee81b396a2865600f08ae6fa15f003d676d0b506651f96e8ad6c58c4e6b"} Apr 24 21:16:04.869233 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:04.869202 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:04.869397 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:04.869325 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:04.933872 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:04.933836 2581 generic.go:358] "Generic (PLEG): container finished" podID="b91d10f43033d8a8aebe3d36b0cb37a8" containerID="98f9dee81b396a2865600f08ae6fa15f003d676d0b506651f96e8ad6c58c4e6b" exitCode=0 Apr 24 21:16:04.934313 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:04.933890 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" event={"ID":"b91d10f43033d8a8aebe3d36b0cb37a8","Type":"ContainerDied","Data":"98f9dee81b396a2865600f08ae6fa15f003d676d0b506651f96e8ad6c58c4e6b"} Apr 24 21:16:04.934313 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:04.933918 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" event={"ID":"b91d10f43033d8a8aebe3d36b0cb37a8","Type":"ContainerStarted","Data":"095b152ce1e3c27a93b07581ddd1149c2a9f45119fa3b82273d6cc9dd94d786d"} Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:05.510998 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:05.511078 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:05.511849 
ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.511199 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.511260 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:09.511242584 +0000 UTC m=+10.190079235 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.511691 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.511709 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.511722 2581 projected.go:194] Error preparing data for projected volume kube-api-access-hb75q for pod openshift-network-diagnostics/network-check-target-5shjj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:05.511849 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.511767 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q podName:9018a4db-1967-45ae-8ad3-7fdd04d6a4d1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:09.511752538 +0000 UTC m=+10.190589195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hb75q" (UniqueName: "kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q") pod "network-check-target-5shjj" (UID: "9018a4db-1967-45ae-8ad3-7fdd04d6a4d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:05.870064 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:05.869784 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:05.870064 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:05.869936 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:06.869170 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:06.868984 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:06.869170 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:06.869107 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:07.869181 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:07.869153 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:07.869655 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:07.869287 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:08.869621 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:08.869579 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:08.870068 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:08.869703 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:09.546746 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:09.546670 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:09.546746 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:09.546732 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:09.547009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.546849 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:09.547009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.546874 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:09.547009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.546890 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:09.547009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.546903 2581 projected.go:194] Error preparing data for projected volume kube-api-access-hb75q for pod openshift-network-diagnostics/network-check-target-5shjj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:09.547009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.546932 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:17.546911216 +0000 UTC m=+18.225747866 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:09.547009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.546950 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q podName:9018a4db-1967-45ae-8ad3-7fdd04d6a4d1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:17.546943088 +0000 UTC m=+18.225779737 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hb75q" (UniqueName: "kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q") pod "network-check-target-5shjj" (UID: "9018a4db-1967-45ae-8ad3-7fdd04d6a4d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:09.870067 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:09.869992 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:09.870526 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:09.870118 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:10.870076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:10.869736 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:10.870076 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:10.870072 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:11.869550 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:11.869522 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:11.869704 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:11.869634 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:12.869134 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:12.869098 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:12.869546 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:12.869197 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:13.869645 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:13.869610 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:13.870061 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:13.869758 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:14.869247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:14.869214 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:14.869393 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:14.869321 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:15.869728 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:15.869701 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:15.870160 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:15.869828 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:16.868969 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:16.868936 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:16.869147 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:16.869057 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:17.606734 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:17.606683 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:17.606750 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.606865 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.606890 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.606906 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.606919 2581 projected.go:194] Error preparing data for projected volume kube-api-access-hb75q for pod openshift-network-diagnostics/network-check-target-5shjj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.606949 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:33.606928482 +0000 UTC m=+34.285765140 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:17.607152 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.606971 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q podName:9018a4db-1967-45ae-8ad3-7fdd04d6a4d1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:33.606960202 +0000 UTC m=+34.285796858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hb75q" (UniqueName: "kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q") pod "network-check-target-5shjj" (UID: "9018a4db-1967-45ae-8ad3-7fdd04d6a4d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:17.869097 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:17.869016 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:17.869260 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:17.869190 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d" Apr 24 21:16:18.869499 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:18.869472 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:18.869959 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:18.869591 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1" Apr 24 21:16:19.870303 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:19.870281 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:16:19.870633 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:19.870389 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:20.869312 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.869061 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:20.869531 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:20.869380 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:20.963101 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.963066 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" event={"ID":"d7dffc39-699b-4872-afc1-cfff9e51ea9d","Type":"ContainerStarted","Data":"e2a0b7bd0d345d559104e78fd4074d82bb76ad0f41ebd5046fd5c8053a519df7"}
Apr 24 21:16:20.965267 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965245 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 21:16:20.965512 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965491 2581 generic.go:358] "Generic (PLEG): container finished" podID="56d7cab8-8a8d-47a6-81da-f1f67f4aed59" containerID="666a3bda022e13fdbcfa1a7f5155258eb08b40265b1115fe76fb2d0ba410e099" exitCode=1
Apr 24 21:16:20.965612 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965564 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"b658710f1d96708304b04f84702b7cc82c265232335238d47552a5d1e5954dd7"}
Apr 24 21:16:20.965612 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965594 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"26ecc03b80a37a8afd2660fa30c09cf96a9adcab3277d69fcfbd44e4e596a57c"}
Apr 24 21:16:20.965692 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965618 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"216a9cc603ce53fb92560ca2b2eb310fdd51ed6c0fbdb940ee2d486b28633423"}
Apr 24 21:16:20.965692 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965634 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"b3590eedf5f5b5dd45820cffc07a3f0bb24f793f23d32f96029411026e2d628a"}
Apr 24 21:16:20.965692 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965645 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerDied","Data":"666a3bda022e13fdbcfa1a7f5155258eb08b40265b1115fe76fb2d0ba410e099"}
Apr 24 21:16:20.965692 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.965661 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"d5cfed27b0a9da9fb97df0c2514fe3870a5ec887e78065c1e4a72d5fd306728e"}
Apr 24 21:16:20.966743 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.966721 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5pvk7" event={"ID":"92521ad1-7ba9-4bdd-bc3b-f470cd17cfef","Type":"ContainerStarted","Data":"90979f17205f7024c0e366251eb51c86db0607ffac152cd4af3a3aea3934ab38"}
Apr 24 21:16:20.967873 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.967854 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bl44b" event={"ID":"5073b991-08af-46af-8888-2f0923d2dfac","Type":"ContainerStarted","Data":"7521c47746522f98800c522c85faf3ec6b3b7e6e59eac164e644e2c80802cbad"}
Apr 24 21:16:20.969035 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.969017 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9rv78" event={"ID":"ff16a19a-c677-4d51-81d3-8d67d7ce1749","Type":"ContainerStarted","Data":"e332b93136db123a2ba6d25715a6d2203bf479d0ccaf42c8abde1978f9a21b30"}
Apr 24 21:16:20.970125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.970104 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwpfl" event={"ID":"801b2a6e-b16d-4e63-b007-af7d6c9273f5","Type":"ContainerStarted","Data":"02c954c5320b344901df7a16a6979d404c9804468c95b6d0f0c180ae63bc7b27"}
Apr 24 21:16:20.971202 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.971181 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kgjxk" event={"ID":"aff945d8-1222-4254-8b87-2cd6e5517284","Type":"ContainerStarted","Data":"c2afa3a6e805749b4aaa3ba38487a301b04f1f87c10d8535a8fd475883ee7cd4"}
Apr 24 21:16:20.972387 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.972368 2581 generic.go:358] "Generic (PLEG): container finished" podID="c61fee18-e272-4bf5-aa08-65392bba68b6" containerID="a0eae4cbcc2996c24af82781748c3ac799b37d1621430aafcf465047e11f0764" exitCode=0
Apr 24 21:16:20.972473 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.972395 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerDied","Data":"a0eae4cbcc2996c24af82781748c3ac799b37d1621430aafcf465047e11f0764"}
Apr 24 21:16:20.978313 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.978275 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-15.ec2.internal" podStartSLOduration=19.978249558999998 podStartE2EDuration="19.978249559s" podCreationTimestamp="2026-04-24 21:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:04.948193698 +0000 UTC m=+5.627030375" watchObservedRunningTime="2026-04-24 21:16:20.978249559 +0000 UTC m=+21.657086225"
Apr 24 21:16:20.978782 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.978756 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5pvk7" podStartSLOduration=3.861566783 podStartE2EDuration="20.97874935s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.699697338 +0000 UTC m=+3.378533989" lastFinishedPulling="2026-04-24 21:16:19.816879892 +0000 UTC m=+20.495716556" observedRunningTime="2026-04-24 21:16:20.978333263 +0000 UTC m=+21.657169935" watchObservedRunningTime="2026-04-24 21:16:20.97874935 +0000 UTC m=+21.657586021"
Apr 24 21:16:20.991476 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:20.991446 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9rv78" podStartSLOduration=3.8761877179999997 podStartE2EDuration="20.991437067s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.703141367 +0000 UTC m=+3.381978028" lastFinishedPulling="2026-04-24 21:16:19.818390713 +0000 UTC m=+20.497227377" observedRunningTime="2026-04-24 21:16:20.991147062 +0000 UTC m=+21.669983734" watchObservedRunningTime="2026-04-24 21:16:20.991437067 +0000 UTC m=+21.670273730"
Apr 24 21:16:21.024109 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.024057 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bl44b" podStartSLOduration=4.939478244 podStartE2EDuration="22.024046149s" podCreationTimestamp="2026-04-24 21:15:59 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.706750999 +0000 UTC m=+3.385587652" lastFinishedPulling="2026-04-24 21:16:19.791318897 +0000 UTC m=+20.470155557" observedRunningTime="2026-04-24 21:16:21.023833302 +0000 UTC m=+21.702669974" watchObservedRunningTime="2026-04-24 21:16:21.024046149 +0000 UTC m=+21.702882820"
Apr 24 21:16:21.036606 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.036461 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nwpfl" podStartSLOduration=4.923195646 podStartE2EDuration="22.036452165s" podCreationTimestamp="2026-04-24 21:15:59 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.702214824 +0000 UTC m=+3.381051475" lastFinishedPulling="2026-04-24 21:16:19.815471341 +0000 UTC m=+20.494307994" observedRunningTime="2026-04-24 21:16:21.036280402 +0000 UTC m=+21.715117075" watchObservedRunningTime="2026-04-24 21:16:21.036452165 +0000 UTC m=+21.715288834"
Apr 24 21:16:21.057322 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.057289 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kgjxk" podStartSLOduration=3.9990101989999998 podStartE2EDuration="21.057280989s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.709054314 +0000 UTC m=+3.387890973" lastFinishedPulling="2026-04-24 21:16:19.7673251 +0000 UTC m=+20.446161763" observedRunningTime="2026-04-24 21:16:21.057074707 +0000 UTC m=+21.735911378" watchObservedRunningTime="2026-04-24 21:16:21.057280989 +0000 UTC m=+21.736117660"
Apr 24 21:16:21.411503 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.411479 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:16:21.839904 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.839816 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:21.411499024Z","UUID":"7e5c6b11-6fa9-45e1-9ad8-4de8fd61a6c6","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:16:21.843276 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.843242 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:16:21.843276 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.843271 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:16:21.868889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.868860 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:21.869039 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:21.868995 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:21.975893 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.975859 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln8q" event={"ID":"a3d5446f-bdf9-4989-9626-7608937edfa8","Type":"ContainerStarted","Data":"bf35344bf50fadfed3068358daa1e7bd084381f8959a7679d0b9293a16c8abe7"}
Apr 24 21:16:21.977783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.977759 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" event={"ID":"d7dffc39-699b-4872-afc1-cfff9e51ea9d","Type":"ContainerStarted","Data":"2742652fd4dd05662b9660c44d7e5f8e2144b9e767ca5e98c532cce8adc214f3"}
Apr 24 21:16:21.998656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:21.998598 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4ln8q" podStartSLOduration=4.87250955 podStartE2EDuration="21.998582339s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.697593423 +0000 UTC m=+3.376430078" lastFinishedPulling="2026-04-24 21:16:19.823666213 +0000 UTC m=+20.502502867" observedRunningTime="2026-04-24 21:16:21.99785962 +0000 UTC m=+22.676696293" watchObservedRunningTime="2026-04-24 21:16:21.998582339 +0000 UTC m=+22.677419015"
Apr 24 21:16:22.869588 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.869566 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:22.869689 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:22.869671 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:22.898144 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.898078 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:22.898787 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.898766 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:22.981616 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.981583 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" event={"ID":"d7dffc39-699b-4872-afc1-cfff9e51ea9d","Type":"ContainerStarted","Data":"87ff242aeec893dcd9062fd8310d2cfcc9c07ca302ea3769babafd8bbc53516b"}
Apr 24 21:16:22.984542 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.984524 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 21:16:22.984942 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.984915 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"e35fb29fe6c1d1eb134005acc1b06ecc4d2aa6b87119113bab7f9d382655c7fa"}
Apr 24 21:16:22.985363 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.985327 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:22.985638 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:22.985621 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kgjxk"
Apr 24 21:16:23.007611 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:23.007561 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gxrpg" podStartSLOduration=3.064966348 podStartE2EDuration="23.007544053s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.70762491 +0000 UTC m=+3.386461567" lastFinishedPulling="2026-04-24 21:16:22.650202613 +0000 UTC m=+23.329039272" observedRunningTime="2026-04-24 21:16:23.007382652 +0000 UTC m=+23.686219324" watchObservedRunningTime="2026-04-24 21:16:23.007544053 +0000 UTC m=+23.686380725"
Apr 24 21:16:23.869143 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:23.869107 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:23.869308 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:23.869257 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:24.869072 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:24.869047 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:24.869640 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:24.869151 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:25.869295 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.869134 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:25.869891 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:25.869390 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:25.992473 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.992438 2581 generic.go:358] "Generic (PLEG): container finished" podID="c61fee18-e272-4bf5-aa08-65392bba68b6" containerID="f134e9f62b7159c3ce4ed2bbe96fe32fc11c649802606a9df671082280abdacf" exitCode=0
Apr 24 21:16:25.992608 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.992517 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerDied","Data":"f134e9f62b7159c3ce4ed2bbe96fe32fc11c649802606a9df671082280abdacf"}
Apr 24 21:16:25.997663 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.997645 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 21:16:25.998003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.997981 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"f122b16e9dc45d9b3ac5d42d4da63c7f6271d30b8e9371a7ddb63ccf1b63ccaf"}
Apr 24 21:16:25.998279 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.998257 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:25.998374 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.998284 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:25.998412 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:25.998372 2581 scope.go:117] "RemoveContainer" containerID="666a3bda022e13fdbcfa1a7f5155258eb08b40265b1115fe76fb2d0ba410e099"
Apr 24 21:16:26.014487 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:26.014468 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:26.869146 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:26.869113 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:26.869296 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:26.869232 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:27.003560 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.003537 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 21:16:27.003958 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.003923 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" event={"ID":"56d7cab8-8a8d-47a6-81da-f1f67f4aed59","Type":"ContainerStarted","Data":"010de5ec613f432bbadc2ab4cbefb2035c8e5130468c7bf1e465c85f1038fbb1"}
Apr 24 21:16:27.004223 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.004193 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:27.006308 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.006278 2581 generic.go:358] "Generic (PLEG): container finished" podID="c61fee18-e272-4bf5-aa08-65392bba68b6" containerID="950c3edc059084d37e314fc30097dface0064a1d79b7e85568fcc91ac7ae2259" exitCode=0
Apr 24 21:16:27.006478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.006318 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerDied","Data":"950c3edc059084d37e314fc30097dface0064a1d79b7e85568fcc91ac7ae2259"}
Apr 24 21:16:27.019857 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.019839 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv"
Apr 24 21:16:27.031366 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.031315 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" podStartSLOduration=9.755219894 podStartE2EDuration="27.031299255s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.705645139 +0000 UTC m=+3.384481795" lastFinishedPulling="2026-04-24 21:16:19.981724492 +0000 UTC m=+20.660561156" observedRunningTime="2026-04-24 21:16:27.02993865 +0000 UTC m=+27.708775322" watchObservedRunningTime="2026-04-24 21:16:27.031299255 +0000 UTC m=+27.710135928"
Apr 24 21:16:27.284940 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.284903 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5shjj"]
Apr 24 21:16:27.285114 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.285013 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:27.285114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:27.285090 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:27.287817 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.287798 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n487x"]
Apr 24 21:16:27.287906 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:27.287889 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:27.287979 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:27.287961 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:28.012810 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:28.012778 2581 generic.go:358] "Generic (PLEG): container finished" podID="c61fee18-e272-4bf5-aa08-65392bba68b6" containerID="494c92cb8f7adbb1f6e60fcdabb03f8e3bd1d2e4699d4f8c260cd9ac760d09fe" exitCode=0
Apr 24 21:16:28.013200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:28.012862 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerDied","Data":"494c92cb8f7adbb1f6e60fcdabb03f8e3bd1d2e4699d4f8c260cd9ac760d09fe"}
Apr 24 21:16:28.869517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:28.869486 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:28.869707 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:28.869492 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:28.869707 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:28.869598 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:28.869707 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:28.869658 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:30.869339 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:30.869299 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:30.870028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:30.869299 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:30.870028 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:30.869463 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:30.870028 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:30.869530 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:32.869254 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:32.869224 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:32.869838 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:32.869224 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:32.869838 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:32.869367 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n487x" podUID="657a2c9b-4e75-4d61-bff2-d8abdd05825d"
Apr 24 21:16:32.869838 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:32.869417 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5shjj" podUID="9018a4db-1967-45ae-8ad3-7fdd04d6a4d1"
Apr 24 21:16:33.153190 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.153105 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-15.ec2.internal" event="NodeReady"
Apr 24 21:16:33.153347 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.153273 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:16:33.190692 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.190663 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58"]
Apr 24 21:16:33.219765 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.219731 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-c7lrn"]
Apr 24 21:16:33.220072 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.219874 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58"
Apr 24 21:16:33.223536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223056 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:16:33.223536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223106 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 21:16:33.223536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223146 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 21:16:33.223536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223261 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:16:33.223536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223359 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 21:16:33.223871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223591 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:16:33.223871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.223599 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 21:16:33.242489 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.242465 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-499sv"]
Apr 24 21:16:33.242652 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.242633 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn"
Apr 24 21:16:33.244794 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.244773 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:16:33.244894 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.244803 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:16:33.244894 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.244812 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:33.244894 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.244850 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-t66bl\""
Apr 24 21:16:33.245202 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.245183 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:33.251984 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.251965 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:16:33.262981 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.262963 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5"]
Apr 24 21:16:33.263130 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.263101 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv"
Apr 24 21:16:33.265234 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.265213 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tjpsp\""
Apr 24 21:16:33.265329 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.265248 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 21:16:33.265375 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.265341 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 21:16:33.282143 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.282125 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5"]
Apr 24 21:16:33.282269 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.282253 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5"
Apr 24 21:16:33.287773 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.287737 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 21:16:33.287910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.287896 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:33.287968 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.287945 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-7j7ck\""
Apr 24 21:16:33.288132 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.288029 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 21:16:33.288772 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.288753 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:33.300406 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.300384 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xfzqb"]
Apr 24 21:16:33.300528 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.300510 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5"
Apr 24 21:16:33.302902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.302883 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 21:16:33.303113 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.303093 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4ncz7\""
Apr 24 21:16:33.318388 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.318366 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb"]
Apr 24 21:16:33.318538 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.318517 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xfzqb"
Apr 24 21:16:33.322526 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.322392 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:16:33.322639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.322608 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:16:33.323054 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.323032 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:16:33.323374 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.323357 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pgxnx\""
Apr 24 21:16:33.323676 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.323657 2581 reflector.go:430]
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.332279 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.332256 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.332387 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.332292 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d799708d-6592-4222-b0e7-a25a20dc584e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.332387 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.332325 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-hub\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.332387 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.332378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fszb\" (UniqueName: \"kubernetes.io/projected/d799708d-6592-4222-b0e7-a25a20dc584e-kube-api-access-6fszb\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.332577 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.332407 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.332577 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.332460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-ca\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.342342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.342320 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"] Apr 24 21:16:33.342449 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.342352 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 21:16:33.342516 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.342461 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" Apr 24 21:16:33.344584 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.344560 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zbvcb\"" Apr 24 21:16:33.345003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.344985 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.345076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.345026 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.363545 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.363521 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68586bbdd8-8kw45"] Apr 24 21:16:33.363680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.363665 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" Apr 24 21:16:33.365688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.365670 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.365784 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.365701 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:16:33.365849 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.365802 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vhrwx\"" Apr 24 21:16:33.365948 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.365933 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:16:33.366223 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.366114 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.384566 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.384546 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6"] Apr 24 21:16:33.384794 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.384775 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.387149 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.387129 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:16:33.387564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.387440 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:16:33.387564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.387452 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:16:33.387564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.387479 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbr4t\"" Apr 24 21:16:33.393097 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.393080 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:16:33.406332 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.406280 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c69778f6c-6k4sq"] Apr 24 21:16:33.406474 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.406447 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6" Apr 24 21:16:33.410714 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.409386 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-gq295\"" Apr 24 21:16:33.410714 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.409636 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.410714 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.409908 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.430417 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.430396 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"] Apr 24 21:16:33.430585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.430568 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.433048 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433022 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.433145 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433068 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-ca\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.433145 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433103 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.433145 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433128 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-serving-cert\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.433310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433157 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whmpr\" (UniqueName: \"kubernetes.io/projected/2cd953b8-a92d-4621-a038-746bab77ff9f-kube-api-access-whmpr\") pod \"volume-data-source-validator-7c6cbb6c87-fvtdb\" (UID: \"2cd953b8-a92d-4621-a038-746bab77ff9f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" Apr 24 21:16:33.433310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433188 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6039cd07-a35a-4794-af31-da75ea5a3fa6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.433310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433236 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-tmp\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.433310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433296 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-snapshots\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.433479 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433331 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9ml\" (UniqueName: 
\"kubernetes.io/projected/a12daad7-f5b8-4a50-9f97-aa1b6d379708-kube-api-access-nf9ml\") pod \"managed-serviceaccount-addon-agent-667bf47b7-wq8b5\" (UID: \"a12daad7-f5b8-4a50-9f97-aa1b6d379708\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.433479 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433365 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.433479 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433391 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d799708d-6592-4222-b0e7-a25a20dc584e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.433479 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433417 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-service-ca-bundle\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.433479 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433468 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6039cd07-a35a-4794-af31-da75ea5a3fa6-config\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: 
\"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.433770 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433492 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgv2\" (UniqueName: \"kubernetes.io/projected/6039cd07-a35a-4794-af31-da75ea5a3fa6-kube-api-access-6hgv2\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.433770 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433517 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:33.433770 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433541 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e187095c-23db-4e09-af90-8e136f238cec-config\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.433770 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433635 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-hub\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.433972 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:16:33.433839 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc8m9\" (UniqueName: \"kubernetes.io/projected/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-kube-api-access-bc8m9\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.433972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433874 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e187095c-23db-4e09-af90-8e136f238cec-serving-cert\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.433972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433900 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ss6\" (UniqueName: \"kubernetes.io/projected/e187095c-23db-4e09-af90-8e136f238cec-kube-api-access-d7ss6\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.433972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.433965 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fszb\" (UniqueName: \"kubernetes.io/projected/d799708d-6592-4222-b0e7-a25a20dc584e-kube-api-access-6fszb\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.434187 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.434005 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a12daad7-f5b8-4a50-9f97-aa1b6d379708-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-667bf47b7-wq8b5\" (UID: \"a12daad7-f5b8-4a50-9f97-aa1b6d379708\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.434271 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.434250 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/680befb0-2e56-4df6-b7ca-58caea84d887-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:33.434326 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.434289 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e187095c-23db-4e09-af90-8e136f238cec-trusted-ca\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.434326 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.434302 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d799708d-6592-4222-b0e7-a25a20dc584e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.437992 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.437817 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-ca\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: 
\"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.437992 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.437943 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-hub\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.437992 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.437940 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.437992 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.437957 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d799708d-6592-4222-b0e7-a25a20dc584e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.441889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.441864 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fszb\" (UniqueName: \"kubernetes.io/projected/d799708d-6592-4222-b0e7-a25a20dc584e-kube-api-access-6fszb\") pod \"cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58\" (UID: \"d799708d-6592-4222-b0e7-a25a20dc584e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.442706 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:16:33.442683 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"] Apr 24 21:16:33.442852 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.442836 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:16:33.445925 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.445900 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-vjpgj\"" Apr 24 21:16:33.446277 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.446255 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.446367 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.446294 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.448756 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.448579 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 21:16:33.451830 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.451811 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"] Apr 24 21:16:33.451962 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.451948 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:16:33.454005 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.453984 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 21:16:33.454005 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.453998 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-55vb9\"" Apr 24 21:16:33.454142 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.454014 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.454142 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.454091 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.454852 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.454837 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 21:16:33.467377 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.467361 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5fdf56dbd-s82zg"] Apr 24 21:16:33.467525 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.467510 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:33.469475 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.469460 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:16:33.481521 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481503 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-c7lrn"] Apr 24 21:16:33.481611 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481570 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58"] Apr 24 21:16:33.481611 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481585 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:33.481611 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481598 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-499sv"] Apr 24 21:16:33.481611 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481613 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb"] Apr 24 21:16:33.481804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481625 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5"] Apr 24 21:16:33.481804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481636 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5"] Apr 24 21:16:33.481804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481647 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-6c69778f6c-6k4sq"] Apr 24 21:16:33.481804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481660 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68586bbdd8-8kw45"] Apr 24 21:16:33.481804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.481676 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pxf27"] Apr 24 21:16:33.483705 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.483687 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.483799 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.483728 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:16:33.483996 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.483982 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:16:33.483996 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.483987 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.484118 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.484000 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:16:33.484118 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.484023 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:16:33.484197 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.484122 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-qjdj8\"" Apr 24 21:16:33.493656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.493641 
2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w76jt"] Apr 24 21:16:33.493804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.493790 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:16:33.495755 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.495736 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:16:33.496067 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.496052 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:16:33.496118 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.496081 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rt6g8\"" Apr 24 21:16:33.496166 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.496059 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:16:33.506077 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506057 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5fdf56dbd-s82zg"] Apr 24 21:16:33.506077 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506079 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"] Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506090 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xfzqb"] Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506103 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w76jt"] Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: 
I0424 21:16:33.506116 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"] Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506187 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506196 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6"] Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506206 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"] Apr 24 21:16:33.506240 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506222 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pxf27"] Apr 24 21:16:33.506604 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.506330 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"] Apr 24 21:16:33.508437 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.508411 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:16:33.508534 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.508416 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vh4z7\"" Apr 24 21:16:33.508534 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.508416 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:16:33.535358 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535338 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d77d521f-bac2-47f1-80bd-1a4c7f08c799-ca-trust-extracted\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.535460 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535375 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:16:33.535460 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535410 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-tmp\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.535460 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535455 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-installation-pull-secrets\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.535597 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535474 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-snapshots\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") 
" pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.535597 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535490 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdc2r\" (UniqueName: \"kubernetes.io/projected/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-kube-api-access-zdc2r\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" Apr 24 21:16:33.535597 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535537 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-certificates\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.535736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535606 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9ml\" (UniqueName: \"kubernetes.io/projected/a12daad7-f5b8-4a50-9f97-aa1b6d379708-kube-api-access-nf9ml\") pod \"managed-serviceaccount-addon-agent-667bf47b7-wq8b5\" (UID: \"a12daad7-f5b8-4a50-9f97-aa1b6d379708\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.535736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535648 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-image-registry-private-configuration\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " 
pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.535736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535678 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-trusted-ca\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.535736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535705 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-tmp\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535751 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535779 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxtg\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-kube-api-access-2fxtg\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535799 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a12daad7-f5b8-4a50-9f97-aa1b6d379708-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-667bf47b7-wq8b5\" (UID: \"a12daad7-f5b8-4a50-9f97-aa1b6d379708\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535817 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535855 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/680befb0-2e56-4df6-b7ca-58caea84d887-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535881 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e187095c-23db-4e09-af90-8e136f238cec-trusted-ca\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535900 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-image-registry-private-configuration\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.535928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535928 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535932 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-snapshots\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.535990 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.535989 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-bound-sa-token\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536035 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-trusted-ca\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.536044 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert podName:680befb0-2e56-4df6-b7ca-58caea84d887 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.036026008 +0000 UTC m=+34.714862658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-499sv" (UID: "680befb0-2e56-4df6-b7ca-58caea84d887") : secret "networking-console-plugin-cert" not found Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536086 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7sm\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-kube-api-access-6q7sm\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536134 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-installation-pull-secrets\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:16:33.536169 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff062d15-1ff3-4d8b-92be-3341e5f59abb-ca-trust-extracted\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.536274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536215 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fz5\" (UniqueName: \"kubernetes.io/projected/0a690ce8-242f-4216-9ac1-7a4d0f94784b-kube-api-access-c6fz5\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536279 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-service-ca-bundle\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536332 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-certificates\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536370 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536447 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc8m9\" (UniqueName: \"kubernetes.io/projected/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-kube-api-access-bc8m9\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536484 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ss6\" (UniqueName: \"kubernetes.io/projected/e187095c-23db-4e09-af90-8e136f238cec-kube-api-access-d7ss6\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536517 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e187095c-23db-4e09-af90-8e136f238cec-serving-cert\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536572 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536631 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/680befb0-2e56-4df6-b7ca-58caea84d887-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536633 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfsvk\" (UniqueName: \"kubernetes.io/projected/76435f4f-785c-4dce-912c-13fbc131a04a-kube-api-access-nfsvk\") pod \"network-check-source-8894fc9bd-kc9q6\" (UID: \"76435f4f-785c-4dce-912c-13fbc131a04a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536679 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6039cd07-a35a-4794-af31-da75ea5a3fa6-config\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.536744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536703 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgv2\" (UniqueName: \"kubernetes.io/projected/6039cd07-a35a-4794-af31-da75ea5a3fa6-kube-api-access-6hgv2\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536757 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e187095c-23db-4e09-af90-8e136f238cec-config\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536792 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536858 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-serving-cert\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536890 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-bound-sa-token\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536954 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-service-ca-bundle\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " 
pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.536944 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whmpr\" (UniqueName: \"kubernetes.io/projected/2cd953b8-a92d-4621-a038-746bab77ff9f-kube-api-access-whmpr\") pod \"volume-data-source-validator-7c6cbb6c87-fvtdb\" (UID: \"2cd953b8-a92d-4621-a038-746bab77ff9f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.537003 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6039cd07-a35a-4794-af31-da75ea5a3fa6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.537275 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.537263 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6039cd07-a35a-4794-af31-da75ea5a3fa6-config\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.541442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.538878 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e187095c-23db-4e09-af90-8e136f238cec-trusted-ca\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.541442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.539257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e187095c-23db-4e09-af90-8e136f238cec-serving-cert\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.541442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.540213 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6039cd07-a35a-4794-af31-da75ea5a3fa6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.542446 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.542055 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-serving-cert\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.543343 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.543023 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a12daad7-f5b8-4a50-9f97-aa1b6d379708-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-667bf47b7-wq8b5\" (UID: \"a12daad7-f5b8-4a50-9f97-aa1b6d379708\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.543343 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.543117 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e187095c-23db-4e09-af90-8e136f238cec-config\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.543954 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:16:33.543931 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.544827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.544506 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" Apr 24 21:16:33.544827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.544744 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9ml\" (UniqueName: \"kubernetes.io/projected/a12daad7-f5b8-4a50-9f97-aa1b6d379708-kube-api-access-nf9ml\") pod \"managed-serviceaccount-addon-agent-667bf47b7-wq8b5\" (UID: \"a12daad7-f5b8-4a50-9f97-aa1b6d379708\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.546919 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.546895 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc8m9\" (UniqueName: \"kubernetes.io/projected/d1b7bcd1-e58f-42c3-9a78-a06df4ff2253-kube-api-access-bc8m9\") pod \"insights-operator-585dfdc468-xfzqb\" (UID: \"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253\") " pod="openshift-insights/insights-operator-585dfdc468-xfzqb" Apr 24 21:16:33.547098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.547077 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgv2\" (UniqueName: \"kubernetes.io/projected/6039cd07-a35a-4794-af31-da75ea5a3fa6-kube-api-access-6hgv2\") pod \"service-ca-operator-d6fc45fc5-k25j5\" (UID: \"6039cd07-a35a-4794-af31-da75ea5a3fa6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 
24 21:16:33.547651 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.547634 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ss6\" (UniqueName: \"kubernetes.io/projected/e187095c-23db-4e09-af90-8e136f238cec-kube-api-access-d7ss6\") pod \"console-operator-9d4b6777b-c7lrn\" (UID: \"e187095c-23db-4e09-af90-8e136f238cec\") " pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.548037 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.548012 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whmpr\" (UniqueName: \"kubernetes.io/projected/2cd953b8-a92d-4621-a038-746bab77ff9f-kube-api-access-whmpr\") pod \"volume-data-source-validator-7c6cbb6c87-fvtdb\" (UID: \"2cd953b8-a92d-4621-a038-746bab77ff9f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" Apr 24 21:16:33.554681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.554655 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:33.592520 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.592485 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" Apr 24 21:16:33.610384 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.610360 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" Apr 24 21:16:33.637323 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637291 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xfzqb"
Apr 24 21:16:33.637789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637538 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fz5\" (UniqueName: \"kubernetes.io/projected/0a690ce8-242f-4216-9ac1-7a4d0f94784b-kube-api-access-c6fz5\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"
Apr 24 21:16:33.637789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637584 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b2a5ce52-613f-45fa-b7c6-83240f376eb7-klusterlet-config\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"
Apr 24 21:16:33.637789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.637789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637652 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-certificates\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.637789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637680 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.637789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.637708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22c2w\" (UniqueName: \"kubernetes.io/projected/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-kube-api-access-22c2w\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:33.638155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638014 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.638155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638067 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.638155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638101 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:33.638155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638128 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfsvk\" (UniqueName: \"kubernetes.io/projected/76435f4f-785c-4dce-912c-13fbc131a04a-kube-api-access-nfsvk\") pod \"network-check-source-8894fc9bd-kc9q6\" (UID: \"76435f4f-785c-4dce-912c-13fbc131a04a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6"
Apr 24 21:16:33.638155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638155 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-tmp-dir\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638208 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a4c82719-9c98-4a75-864d-75fb12509cb1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638253 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-default-certificate\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638277 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638307 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-bound-sa-token\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638336 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d77d521f-bac2-47f1-80bd-1a4c7f08c799-ca-trust-extracted\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638366 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"
Apr 24 21:16:33.638409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638393 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-stats-auth\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638443 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-installation-pull-secrets\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638467 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-config-volume\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638496 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdc2r\" (UniqueName: \"kubernetes.io/projected/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-kube-api-access-zdc2r\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638531 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-certificates\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638562 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638591 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpk8\" (UniqueName: \"kubernetes.io/projected/a4c82719-9c98-4a75-864d-75fb12509cb1-kube-api-access-kdpk8\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638623 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-image-registry-private-configuration\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638670 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-trusted-ca\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638701 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-certificates\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.638131 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:33.638822 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.638741 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c69778f6c-6k4sq: secret "image-registry-tls" not found
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638848 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2kj\" (UniqueName: \"kubernetes.io/projected/b2a5ce52-613f-45fa-b7c6-83240f376eb7-kube-api-access-mr2kj\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638880 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjn6p\" (UniqueName: \"kubernetes.io/projected/aefd5dcc-7f58-4fab-8028-2cffcff95339-kube-api-access-kjn6p\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27"
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.638889 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638910 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.638947 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs podName:657a2c9b-4e75-4d61-bff2-d8abdd05825d nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.63892845 +0000 UTC m=+66.317765104 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs") pod "network-metrics-daemon-n487x" (UID: "657a2c9b-4e75-4d61-bff2-d8abdd05825d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.638969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxtg\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-kube-api-access-2fxtg\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.639007 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.639155 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.639170 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68586bbdd8-8kw45: secret "image-registry-tls" not found
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.639223 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls podName:d77d521f-bac2-47f1-80bd-1a4c7f08c799 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.139203859 +0000 UTC m=+34.818040540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls") pod "image-registry-68586bbdd8-8kw45" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799") : secret "image-registry-tls" not found
Apr 24 21:16:33.639568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.639390 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-image-registry-private-configuration\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.640146 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.639836 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:16:33.640146 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.639915 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls podName:0a690ce8-242f-4216-9ac1-7a4d0f94784b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.139896561 +0000 UTC m=+34.818733210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7q4x" (UID: "0a690ce8-242f-4216-9ac1-7a4d0f94784b") : secret "samples-operator-tls" not found
Apr 24 21:16:33.640257 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640212 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d77d521f-bac2-47f1-80bd-1a4c7f08c799-ca-trust-extracted\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.640257 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.640226 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls podName:ff062d15-1ff3-4d8b-92be-3341e5f59abb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.140209071 +0000 UTC m=+34.819045737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls") pod "image-registry-6c69778f6c-6k4sq" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb") : secret "image-registry-tls" not found
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640359 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2a5ce52-613f-45fa-b7c6-83240f376eb7-tmp\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640408 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-bound-sa-token\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640453 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-trusted-ca\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640481 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7sm\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-kube-api-access-6q7sm\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640512 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640546 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-installation-pull-secrets\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640572 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff062d15-1ff3-4d8b-92be-3341e5f59abb-ca-trust-extracted\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640606 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4dkz\" (UniqueName: \"kubernetes.io/projected/543220ca-e10a-465a-a5ee-a24026536361-kube-api-access-x4dkz\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.640910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.640835 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.642680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.642189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-trusted-ca\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.643261 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.643234 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff062d15-1ff3-4d8b-92be-3341e5f59abb-ca-trust-extracted\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.643792 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.643766 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-certificates\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.648895 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.644600 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-trusted-ca\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.648895 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.646083 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.648895 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.646356 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-image-registry-private-configuration\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.648895 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.646931 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-installation-pull-secrets\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.649714 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.649665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxtg\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-kube-api-access-2fxtg\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.650661 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.650620 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-installation-pull-secrets\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.651928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.651542 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb"
Apr 24 21:16:33.652440 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.652087 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6fz5\" (UniqueName: \"kubernetes.io/projected/0a690ce8-242f-4216-9ac1-7a4d0f94784b-kube-api-access-c6fz5\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"
Apr 24 21:16:33.652511 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.652485 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-image-registry-private-configuration\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.653205 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.653161 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdc2r\" (UniqueName: \"kubernetes.io/projected/cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45-kube-api-access-zdc2r\") pod \"kube-storage-version-migrator-operator-6769c5d45-z962g\" (UID: \"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.655057 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.654255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7sm\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-kube-api-access-6q7sm\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.655057 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.654642 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-bound-sa-token\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:33.655057 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.655015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb75q\" (UniqueName: \"kubernetes.io/projected/9018a4db-1967-45ae-8ad3-7fdd04d6a4d1-kube-api-access-hb75q\") pod \"network-check-target-5shjj\" (UID: \"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1\") " pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:33.655749 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.655724 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfsvk\" (UniqueName: \"kubernetes.io/projected/76435f4f-785c-4dce-912c-13fbc131a04a-kube-api-access-nfsvk\") pod \"network-check-source-8894fc9bd-kc9q6\" (UID: \"76435f4f-785c-4dce-912c-13fbc131a04a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6"
Apr 24 21:16:33.657104 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.657000 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-bound-sa-token\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:33.674345 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.673722 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"
Apr 24 21:16:33.708856 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.708679 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58"]
Apr 24 21:16:33.711626 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:33.711502 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd799708d_6592_4222_b0e7_a25a20dc584e.slice/crio-829ddf6c9fb114e2eeff8cefaefb1e2e0d5d655466e3cc181f72b3b03631472f WatchSource:0}: Error finding container 829ddf6c9fb114e2eeff8cefaefb1e2e0d5d655466e3cc181f72b3b03631472f: Status 404 returned error can't find the container with id 829ddf6c9fb114e2eeff8cefaefb1e2e0d5d655466e3cc181f72b3b03631472f
Apr 24 21:16:33.712194 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.712178 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-c7lrn"]
Apr 24 21:16:33.715076 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:33.715052 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode187095c_23db_4e09_af90_8e136f238cec.slice/crio-16ee8d5ce388d6e54bc4bb9ccd8bf73f9572ab84a1b28e574f1e1f46b9618995 WatchSource:0}: Error finding container 16ee8d5ce388d6e54bc4bb9ccd8bf73f9572ab84a1b28e574f1e1f46b9618995: Status 404 returned error can't find the container with id 16ee8d5ce388d6e54bc4bb9ccd8bf73f9572ab84a1b28e574f1e1f46b9618995
Apr 24 21:16:33.716581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.716565 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6"
Apr 24 21:16:33.741131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741104 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2a5ce52-613f-45fa-b7c6-83240f376eb7-tmp\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"
Apr 24 21:16:33.741247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741151 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:33.741247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741186 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4dkz\" (UniqueName: \"kubernetes.io/projected/543220ca-e10a-465a-a5ee-a24026536361-kube-api-access-x4dkz\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.741247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741221 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b2a5ce52-613f-45fa-b7c6-83240f376eb7-klusterlet-config\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"
Apr 24 21:16:33.741405 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741254 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.741405 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22c2w\" (UniqueName: \"kubernetes.io/projected/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-kube-api-access-22c2w\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:33.741405 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741306 2581 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:33.741405 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741328 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:33.741405 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741362 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27"
Apr 24 21:16:33.741405 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741382 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls podName:a4c82719-9c98-4a75-864d-75fb12509cb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.241362724 +0000 UTC m=+34.920199391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lhgp5" (UID: "a4c82719-9c98-4a75-864d-75fb12509cb1") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741418 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-tmp-dir\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a4c82719-9c98-4a75-864d-75fb12509cb1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741506 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741511 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName:
\"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-default-certificate\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741539 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741554 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert podName:aefd5dcc-7f58-4fab-8028-2cffcff95339 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.241538598 +0000 UTC m=+34.920375248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert") pod "ingress-canary-pxf27" (UID: "aefd5dcc-7f58-4fab-8028-2cffcff95339") : secret "canary-serving-cert" not found Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741610 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741623 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741651 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls podName:d79fb269-2ec1-4e09-a8d6-9b2a367d21b6 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:34.241635358 +0000 UTC m=+34.920472024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls") pod "dns-default-w76jt" (UID: "d79fb269-2ec1-4e09-a8d6-9b2a367d21b6") : secret "dns-default-metrics-tls" not found Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741678 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.241659277 +0000 UTC m=+34.920495948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : secret "router-metrics-certs-default" not found Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741716 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-stats-auth\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:33.741768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741748 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-config-volume\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:33.742300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741549 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2a5ce52-613f-45fa-b7c6-83240f376eb7-tmp\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:33.742300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741795 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpk8\" (UniqueName: \"kubernetes.io/projected/a4c82719-9c98-4a75-864d-75fb12509cb1-kube-api-access-kdpk8\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:16:33.742300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741830 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2kj\" (UniqueName: \"kubernetes.io/projected/b2a5ce52-613f-45fa-b7c6-83240f376eb7-kube-api-access-mr2kj\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:33.742300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741860 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjn6p\" (UniqueName: \"kubernetes.io/projected/aefd5dcc-7f58-4fab-8028-2cffcff95339-kube-api-access-kjn6p\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:16:33.742300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.741921 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-tmp-dir\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" 
Apr 24 21:16:33.742300 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:33.741965 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:34.241950341 +0000 UTC m=+34.920786994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:33.742594 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.742351 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a4c82719-9c98-4a75-864d-75fb12509cb1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:16:33.742897 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.742782 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-config-volume\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:33.745346 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.745305 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-default-certificate\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:33.745658 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:16:33.745530 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b2a5ce52-613f-45fa-b7c6-83240f376eb7-klusterlet-config\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: \"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:33.745907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.745884 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-stats-auth\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:33.752970 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.752915 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22c2w\" (UniqueName: \"kubernetes.io/projected/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-kube-api-access-22c2w\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:33.753473 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.753196 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4dkz\" (UniqueName: \"kubernetes.io/projected/543220ca-e10a-465a-a5ee-a24026536361-kube-api-access-x4dkz\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:33.754478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.754417 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2kj\" (UniqueName: \"kubernetes.io/projected/b2a5ce52-613f-45fa-b7c6-83240f376eb7-kube-api-access-mr2kj\") pod \"klusterlet-addon-workmgr-6dfdcff997-vlddk\" (UID: 
\"b2a5ce52-613f-45fa-b7c6-83240f376eb7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:33.756023 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.755980 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjn6p\" (UniqueName: \"kubernetes.io/projected/aefd5dcc-7f58-4fab-8028-2cffcff95339-kube-api-access-kjn6p\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:16:33.761015 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.760975 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpk8\" (UniqueName: \"kubernetes.io/projected/a4c82719-9c98-4a75-864d-75fb12509cb1-kube-api-access-kdpk8\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:16:33.776056 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.775691 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:33.946918 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.945537 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5"] Apr 24 21:16:33.948807 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:33.948767 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6039cd07_a35a_4794_af31_da75ea5a3fa6.slice/crio-3eb9bcb5da00fc370ade8f4dbd1f521fa3d7766dda4e9326e1b97e936d40baab WatchSource:0}: Error finding container 3eb9bcb5da00fc370ade8f4dbd1f521fa3d7766dda4e9326e1b97e936d40baab: Status 404 returned error can't find the container with id 3eb9bcb5da00fc370ade8f4dbd1f521fa3d7766dda4e9326e1b97e936d40baab Apr 24 21:16:33.993370 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:33.993158 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xfzqb"] Apr 24 21:16:33.997618 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:33.997210 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b7bcd1_e58f_42c3_9a78_a06df4ff2253.slice/crio-f730f8bea4c94cb7b33e743ea6769d6b9062d05d251dd0f5b0441aa0d6a0726f WatchSource:0}: Error finding container f730f8bea4c94cb7b33e743ea6769d6b9062d05d251dd0f5b0441aa0d6a0726f: Status 404 returned error can't find the container with id f730f8bea4c94cb7b33e743ea6769d6b9062d05d251dd0f5b0441aa0d6a0726f Apr 24 21:16:34.004737 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.004717 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk"] Apr 24 21:16:34.009505 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.009480 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6"] Apr 24 21:16:34.009797 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:34.009772 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a5ce52_613f_45fa_b7c6_83240f376eb7.slice/crio-61175954fb7c1f3dee77669748527f5958b1ba1599304e527f0d69b9bc06bfb7 WatchSource:0}: Error finding container 61175954fb7c1f3dee77669748527f5958b1ba1599304e527f0d69b9bc06bfb7: Status 404 returned error can't find the container with id 61175954fb7c1f3dee77669748527f5958b1ba1599304e527f0d69b9bc06bfb7 Apr 24 21:16:34.012488 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:34.012460 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76435f4f_785c_4dce_912c_13fbc131a04a.slice/crio-d816ab64d78d5dfdf0982be9b7b50fd37f72cb25d543c923b999ab2420ef9d6e WatchSource:0}: Error finding container d816ab64d78d5dfdf0982be9b7b50fd37f72cb25d543c923b999ab2420ef9d6e: Status 404 returned error can't find the container with id d816ab64d78d5dfdf0982be9b7b50fd37f72cb25d543c923b999ab2420ef9d6e Apr 24 21:16:34.018249 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.018102 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g"] Apr 24 21:16:34.019173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.019151 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb"] Apr 24 21:16:34.020846 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:34.020824 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdd306f5_b3ab_47c4_ac0d_ba9ba28c5e45.slice/crio-a5bd392b5bfc1c455f1a7c13dd13de3f27f329e78d208289b2dacd1036701cbc 
WatchSource:0}: Error finding container a5bd392b5bfc1c455f1a7c13dd13de3f27f329e78d208289b2dacd1036701cbc: Status 404 returned error can't find the container with id a5bd392b5bfc1c455f1a7c13dd13de3f27f329e78d208289b2dacd1036701cbc Apr 24 21:16:34.021834 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.021820 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5"] Apr 24 21:16:34.022144 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:34.022123 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd953b8_a92d_4621_a038_746bab77ff9f.slice/crio-e486a1cb0033885a22450c68dad5fd0c6b26c39841933f017dc1109c123d0929 WatchSource:0}: Error finding container e486a1cb0033885a22450c68dad5fd0c6b26c39841933f017dc1109c123d0929: Status 404 returned error can't find the container with id e486a1cb0033885a22450c68dad5fd0c6b26c39841933f017dc1109c123d0929 Apr 24 21:16:34.025058 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:34.025032 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda12daad7_f5b8_4a50_9f97_aa1b6d379708.slice/crio-3f493dec1cc3b823c043ab2a86b9c4b93a1484f75b99539ccaff08ed46017f97 WatchSource:0}: Error finding container 3f493dec1cc3b823c043ab2a86b9c4b93a1484f75b99539ccaff08ed46017f97: Status 404 returned error can't find the container with id 3f493dec1cc3b823c043ab2a86b9c4b93a1484f75b99539ccaff08ed46017f97 Apr 24 21:16:34.025058 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.025036 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" event={"ID":"6039cd07-a35a-4794-af31-da75ea5a3fa6","Type":"ContainerStarted","Data":"3eb9bcb5da00fc370ade8f4dbd1f521fa3d7766dda4e9326e1b97e936d40baab"} Apr 24 21:16:34.026142 ip-10-0-139-15 kubenswrapper[2581]: 
I0424 21:16:34.026123 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" event={"ID":"e187095c-23db-4e09-af90-8e136f238cec","Type":"ContainerStarted","Data":"16ee8d5ce388d6e54bc4bb9ccd8bf73f9572ab84a1b28e574f1e1f46b9618995"} Apr 24 21:16:34.027218 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.027183 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" event={"ID":"b2a5ce52-613f-45fa-b7c6-83240f376eb7","Type":"ContainerStarted","Data":"61175954fb7c1f3dee77669748527f5958b1ba1599304e527f0d69b9bc06bfb7"} Apr 24 21:16:34.028200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.028176 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" event={"ID":"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45","Type":"ContainerStarted","Data":"a5bd392b5bfc1c455f1a7c13dd13de3f27f329e78d208289b2dacd1036701cbc"} Apr 24 21:16:34.029158 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.029132 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" event={"ID":"d799708d-6592-4222-b0e7-a25a20dc584e","Type":"ContainerStarted","Data":"829ddf6c9fb114e2eeff8cefaefb1e2e0d5d655466e3cc181f72b3b03631472f"} Apr 24 21:16:34.029983 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.029960 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6" event={"ID":"76435f4f-785c-4dce-912c-13fbc131a04a","Type":"ContainerStarted","Data":"d816ab64d78d5dfdf0982be9b7b50fd37f72cb25d543c923b999ab2420ef9d6e"} Apr 24 21:16:34.030867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.030846 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xfzqb" 
event={"ID":"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253","Type":"ContainerStarted","Data":"f730f8bea4c94cb7b33e743ea6769d6b9062d05d251dd0f5b0441aa0d6a0726f"} Apr 24 21:16:34.048271 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.048249 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:34.048370 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.048354 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:34.048415 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.048405 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert podName:680befb0-2e56-4df6-b7ca-58caea84d887 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.04839148 +0000 UTC m=+35.727228130 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-499sv" (UID: "680befb0-2e56-4df6-b7ca-58caea84d887") : secret "networking-console-plugin-cert" not found Apr 24 21:16:34.148744 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.148573 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:34.148890 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.148795 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:34.148890 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148721 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:34.148890 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148819 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68586bbdd8-8kw45: secret "image-registry-tls" not found Apr 24 21:16:34.148890 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.148854 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: 
\"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:16:34.148890 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148870 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls podName:d77d521f-bac2-47f1-80bd-1a4c7f08c799 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.148856028 +0000 UTC m=+35.827692678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls") pod "image-registry-68586bbdd8-8kw45" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799") : secret "image-registry-tls" not found Apr 24 21:16:34.149123 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148924 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:34.149123 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148960 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:34.149123 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148980 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c69778f6c-6k4sq: secret "image-registry-tls" not found Apr 24 21:16:34.149123 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.148971 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls podName:0a690ce8-242f-4216-9ac1-7a4d0f94784b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.148960059 +0000 UTC m=+35.827796710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7q4x" (UID: "0a690ce8-242f-4216-9ac1-7a4d0f94784b") : secret "samples-operator-tls" not found Apr 24 21:16:34.149123 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.149050 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls podName:ff062d15-1ff3-4d8b-92be-3341e5f59abb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.149036638 +0000 UTC m=+35.827873288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls") pod "image-registry-6c69778f6c-6k4sq" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb") : secret "image-registry-tls" not found Apr 24 21:16:34.250049 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.250016 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:34.250178 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.250066 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:34.250178 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250167 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.250154873 +0000 UTC m=+35.928991524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : configmap references non-existent config key: service-ca.crt
Apr 24 21:16:34.250327 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.250183 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27"
Apr 24 21:16:34.250327 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250189 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:16:34.250327 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.250226 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:34.250327 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250256 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.25023987 +0000 UTC m=+35.929076523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : secret "router-metrics-certs-default" not found
Apr 24 21:16:34.250535 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250332 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:34.250535 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250382 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert podName:aefd5dcc-7f58-4fab-8028-2cffcff95339 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.250366864 +0000 UTC m=+35.929203529 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert") pod "ingress-canary-pxf27" (UID: "aefd5dcc-7f58-4fab-8028-2cffcff95339") : secret "canary-serving-cert" not found
Apr 24 21:16:34.250535 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250391 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:34.250535 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.250445 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:34.250535 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250489 2581 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:34.250535 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250526 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls podName:d79fb269-2ec1-4e09-a8d6-9b2a367d21b6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.25051053 +0000 UTC m=+35.929347188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls") pod "dns-default-w76jt" (UID: "d79fb269-2ec1-4e09-a8d6-9b2a367d21b6") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:34.250730 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:34.250551 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls podName:a4c82719-9c98-4a75-864d-75fb12509cb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:35.250542058 +0000 UTC m=+35.929378708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lhgp5" (UID: "a4c82719-9c98-4a75-864d-75fb12509cb1") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:34.870814 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.869687 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:34.870814 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.870148 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x"
Apr 24 21:16:34.875356 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.874995 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:16:34.875881 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.875610 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8v42p\""
Apr 24 21:16:34.875881 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.875747 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dd9qw\""
Apr 24 21:16:34.889360 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:34.888975 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5shjj"
Apr 24 21:16:35.046980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.046945 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" event={"ID":"a12daad7-f5b8-4a50-9f97-aa1b6d379708","Type":"ContainerStarted","Data":"3f493dec1cc3b823c043ab2a86b9c4b93a1484f75b99539ccaff08ed46017f97"}
Apr 24 21:16:35.049399 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.049339 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" event={"ID":"2cd953b8-a92d-4621-a038-746bab77ff9f","Type":"ContainerStarted","Data":"e486a1cb0033885a22450c68dad5fd0c6b26c39841933f017dc1109c123d0929"}
Apr 24 21:16:35.061128 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.059491 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv"
Apr 24 21:16:35.061128 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.059771 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:16:35.061128 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.059831 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert podName:680befb0-2e56-4df6-b7ca-58caea84d887 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.059812591 +0000 UTC m=+37.738649247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-499sv" (UID: "680befb0-2e56-4df6-b7ca-58caea84d887") : secret "networking-console-plugin-cert" not found
Apr 24 21:16:35.061128 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.060263 2581 generic.go:358] "Generic (PLEG): container finished" podID="c61fee18-e272-4bf5-aa08-65392bba68b6" containerID="be2b5059606e08df3881a326569bfa247c54ed309435d030bfd9bba717d04336" exitCode=0
Apr 24 21:16:35.061128 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.060297 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerDied","Data":"be2b5059606e08df3881a326569bfa247c54ed309435d030bfd9bba717d04336"}
Apr 24 21:16:35.093000 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.092957 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5shjj"]
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.160877 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.161003 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.161065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.161878 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.161941 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls podName:0a690ce8-242f-4216-9ac1-7a4d0f94784b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.161921313 +0000 UTC m=+37.840757970 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7q4x" (UID: "0a690ce8-242f-4216-9ac1-7a4d0f94784b") : secret "samples-operator-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.162364 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.162380 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c69778f6c-6k4sq: secret "image-registry-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.162439 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls podName:ff062d15-1ff3-4d8b-92be-3341e5f59abb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.162410291 +0000 UTC m=+37.841246944 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls") pod "image-registry-6c69778f6c-6k4sq" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb") : secret "image-registry-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.162876 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.162891 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68586bbdd8-8kw45: secret "image-registry-tls" not found
Apr 24 21:16:35.163114 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.162926 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls podName:d77d521f-bac2-47f1-80bd-1a4c7f08c799 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.162912943 +0000 UTC m=+37.841749594 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls") pod "image-registry-68586bbdd8-8kw45" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799") : secret "image-registry-tls" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.262599 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.262655 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27"
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.262701 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.262813 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:35.262862 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263001 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263057 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.263040703 +0000 UTC m=+37.941877357 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : secret "router-metrics-certs-default" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263483 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.263467131 +0000 UTC m=+37.942303801 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : configmap references non-existent config key: service-ca.crt
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263565 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263597 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert podName:aefd5dcc-7f58-4fab-8028-2cffcff95339 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.263586463 +0000 UTC m=+37.942423113 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert") pod "ingress-canary-pxf27" (UID: "aefd5dcc-7f58-4fab-8028-2cffcff95339") : secret "canary-serving-cert" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263652 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263681 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls podName:d79fb269-2ec1-4e09-a8d6-9b2a367d21b6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.263670765 +0000 UTC m=+37.942507417 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls") pod "dns-default-w76jt" (UID: "d79fb269-2ec1-4e09-a8d6-9b2a367d21b6") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:35.263807 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.263743 2581 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:35.264810 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:35.264738 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls podName:a4c82719-9c98-4a75-864d-75fb12509cb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:37.264719963 +0000 UTC m=+37.943556626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lhgp5" (UID: "a4c82719-9c98-4a75-864d-75fb12509cb1") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:36.121435 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:36.121368 2581 generic.go:358] "Generic (PLEG): container finished" podID="c61fee18-e272-4bf5-aa08-65392bba68b6" containerID="ee09c7c095803d0a6b1bcb9ee9ae178b47f73e9b651757345f331abac2ce524c" exitCode=0
Apr 24 21:16:36.122558 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:36.121477 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerDied","Data":"ee09c7c095803d0a6b1bcb9ee9ae178b47f73e9b651757345f331abac2ce524c"}
Apr 24 21:16:36.146646 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:36.146582 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5shjj" event={"ID":"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1","Type":"ContainerStarted","Data":"8917bf739c5a9a3bd904cfc97e59f8be353c68f9b524778e22903935e3eae4bd"}
Apr 24 21:16:37.084442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.084158 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv"
Apr 24 21:16:37.084605 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.084546 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:16:37.084665 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.084614 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert podName:680befb0-2e56-4df6-b7ca-58caea84d887 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.084592632 +0000 UTC m=+41.763429288 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-499sv" (UID: "680befb0-2e56-4df6-b7ca-58caea84d887") : secret "networking-console-plugin-cert" not found
Apr 24 21:16:37.185719 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.185684 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:37.186238 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.185793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:37.186238 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.185886 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"
Apr 24 21:16:37.186238 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186034 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:16:37.186238 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186092 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls podName:0a690ce8-242f-4216-9ac1-7a4d0f94784b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.18607175 +0000 UTC m=+41.864908402 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7q4x" (UID: "0a690ce8-242f-4216-9ac1-7a4d0f94784b") : secret "samples-operator-tls" not found
Apr 24 21:16:37.186728 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186563 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:37.186728 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186583 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68586bbdd8-8kw45: secret "image-registry-tls" not found
Apr 24 21:16:37.186728 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186628 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls podName:d77d521f-bac2-47f1-80bd-1a4c7f08c799 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.186612242 +0000 UTC m=+41.865448895 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls") pod "image-registry-68586bbdd8-8kw45" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799") : secret "image-registry-tls" not found
Apr 24 21:16:37.186728 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186688 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:37.186728 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186697 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c69778f6c-6k4sq: secret "image-registry-tls" not found
Apr 24 21:16:37.186728 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.186728 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls podName:ff062d15-1ff3-4d8b-92be-3341e5f59abb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.186718082 +0000 UTC m=+41.865554735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls") pod "image-registry-6c69778f6c-6k4sq" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb") : secret "image-registry-tls" not found
Apr 24 21:16:37.205983 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.205026 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5shvq" event={"ID":"c61fee18-e272-4bf5-aa08-65392bba68b6","Type":"ContainerStarted","Data":"202d5a8f0aa62c62c9cb6b274b44618267d14ca92c4cd38af4669a9cb01cfc3a"}
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.286940 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.287013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.287054 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg"
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.287084 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27"
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:37.287134 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt"
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287675 2581 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287722 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.287703721 +0000 UTC m=+41.966540375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : configmap references non-existent config key: service-ca.crt
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287741 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls podName:a4c82719-9c98-4a75-864d-75fb12509cb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.287731662 +0000 UTC m=+41.966568315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lhgp5" (UID: "a4c82719-9c98-4a75-864d-75fb12509cb1") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287750 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287778 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert podName:aefd5dcc-7f58-4fab-8028-2cffcff95339 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.287767958 +0000 UTC m=+41.966604608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert") pod "ingress-canary-pxf27" (UID: "aefd5dcc-7f58-4fab-8028-2cffcff95339") : secret "canary-serving-cert" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287792 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287821 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.287812367 +0000 UTC m=+41.966649017 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : secret "router-metrics-certs-default" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287829 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:37.287953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:37.287853 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls podName:d79fb269-2ec1-4e09-a8d6-9b2a367d21b6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:41.287846025 +0000 UTC m=+41.966682675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls") pod "dns-default-w76jt" (UID: "d79fb269-2ec1-4e09-a8d6-9b2a367d21b6") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:39.894666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:39.894622 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5shvq" podStartSLOduration=8.851953215 podStartE2EDuration="39.894608174s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:02.708213757 +0000 UTC m=+3.387050411" lastFinishedPulling="2026-04-24 21:16:33.750868718 +0000 UTC m=+34.429705370" observedRunningTime="2026-04-24 21:16:37.229484121 +0000 UTC m=+37.908320794" watchObservedRunningTime="2026-04-24 21:16:39.894608174 +0000 UTC m=+40.573444845"
Apr 24 21:16:41.122759 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.122570 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv"
Apr 24 21:16:41.122759 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.122692 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:16:41.122759 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.122760 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert podName:680befb0-2e56-4df6-b7ca-58caea84d887 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.122742576 +0000 UTC m=+49.801579232 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-499sv" (UID: "680befb0-2e56-4df6-b7ca-58caea84d887") : secret "networking-console-plugin-cert" not found
Apr 24 21:16:41.224148 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.224117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"
Apr 24 21:16:41.224317 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.224184 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45"
Apr 24 21:16:41.224317 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224262 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:16:41.224317 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.224279 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq"
Apr 24 21:16:41.224317 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224311 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:41.224539 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224330 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68586bbdd8-8kw45: secret "image-registry-tls" not found
Apr 24 21:16:41.224539 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224320 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls podName:0a690ce8-242f-4216-9ac1-7a4d0f94784b nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.224304027 +0000 UTC m=+49.903140677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7q4x" (UID: "0a690ce8-242f-4216-9ac1-7a4d0f94784b") : secret "samples-operator-tls" not found
Apr 24 21:16:41.224539 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224381 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls podName:d77d521f-bac2-47f1-80bd-1a4c7f08c799 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.224369067 +0000 UTC m=+49.903205717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls") pod "image-registry-68586bbdd8-8kw45" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799") : secret "image-registry-tls" not found
Apr 24 21:16:41.224539 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224383 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:41.224539 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224397 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c69778f6c-6k4sq: secret "image-registry-tls" not found
Apr 24 21:16:41.224539 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.224464 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls podName:ff062d15-1ff3-4d8b-92be-3341e5f59abb nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.224450387 +0000 UTC m=+49.903287041 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls") pod "image-registry-6c69778f6c-6k4sq" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb") : secret "image-registry-tls" not found Apr 24 21:16:41.325230 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.325203 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:16:41.325404 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.325262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:41.325404 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.325302 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:41.325404 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325326 2581 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:41.325404 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.325333 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:16:41.325404 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:41.325370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:41.325404 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325380 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls podName:a4c82719-9c98-4a75-864d-75fb12509cb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.325368135 +0000 UTC m=+50.004204785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lhgp5" (UID: "a4c82719-9c98-4a75-864d-75fb12509cb1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325407 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325420 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.325407571 +0000 UTC m=+50.004244221 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325463 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325473 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.325457557 +0000 UTC m=+50.004294209 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : secret "router-metrics-certs-default" not found Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325489 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325504 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls podName:d79fb269-2ec1-4e09-a8d6-9b2a367d21b6 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.325490944 +0000 UTC m=+50.004327598 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls") pod "dns-default-w76jt" (UID: "d79fb269-2ec1-4e09-a8d6-9b2a367d21b6") : secret "dns-default-metrics-tls" not found Apr 24 21:16:41.325676 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:41.325519 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert podName:aefd5dcc-7f58-4fab-8028-2cffcff95339 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:49.325511762 +0000 UTC m=+50.004348417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert") pod "ingress-canary-pxf27" (UID: "aefd5dcc-7f58-4fab-8028-2cffcff95339") : secret "canary-serving-cert" not found Apr 24 21:16:47.236614 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.236483 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5shjj" event={"ID":"9018a4db-1967-45ae-8ad3-7fdd04d6a4d1","Type":"ContainerStarted","Data":"3e535e54d4075ddd40e1c807c55162964d7c5136b86dc999a4b199269e983a3b"} Apr 24 21:16:47.237391 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.237322 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:16:47.241943 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.241257 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6" event={"ID":"76435f4f-785c-4dce-912c-13fbc131a04a","Type":"ContainerStarted","Data":"c6042a22f6345cfb870141db12216f40d82d01e9fbb20e7724f48c1a12189325"} Apr 24 21:16:47.244183 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.243701 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-xfzqb" event={"ID":"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253","Type":"ContainerStarted","Data":"389c82caffb0d8370bbc1f5b36a010a444dba0dbe1213457688f8ed14be5146b"} Apr 24 21:16:47.247572 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.246868 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" event={"ID":"a12daad7-f5b8-4a50-9f97-aa1b6d379708","Type":"ContainerStarted","Data":"d6e2cea436ecf57571d110aaf760883f9679273e909dd023142f8853b96e6032"} Apr 24 21:16:47.250359 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.249789 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" event={"ID":"6039cd07-a35a-4794-af31-da75ea5a3fa6","Type":"ContainerStarted","Data":"33dba131003e42c603eea4a2f027428329e5efd358ba4b4fa1e96405aab65ba2"} Apr 24 21:16:47.253300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.251527 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" event={"ID":"b2a5ce52-613f-45fa-b7c6-83240f376eb7","Type":"ContainerStarted","Data":"43c2ff0fcf1fbd69f336e1b6268d4f645e3df110e2664ec02d4f718d62bb2218"} Apr 24 21:16:47.256464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.253416 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:47.256464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.255025 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5shjj" podStartSLOduration=35.326994614 podStartE2EDuration="47.255011431s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:16:35.101961339 +0000 UTC m=+35.780798004" 
lastFinishedPulling="2026-04-24 21:16:47.029978156 +0000 UTC m=+47.708814821" observedRunningTime="2026-04-24 21:16:47.253859878 +0000 UTC m=+47.932696554" watchObservedRunningTime="2026-04-24 21:16:47.255011431 +0000 UTC m=+47.933848104" Apr 24 21:16:47.259219 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.258743 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" event={"ID":"2cd953b8-a92d-4621-a038-746bab77ff9f","Type":"ContainerStarted","Data":"634517ae2178d34fa440f54edf0ab19b2e232bc0fd55cbaddeac8e967f82b5b1"} Apr 24 21:16:47.259550 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.259471 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" podUID="b2a5ce52-613f-45fa-b7c6-83240f376eb7" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.18:8000/readyz\": dial tcp 10.132.0.18:8000: connect: connection refused" Apr 24 21:16:47.295617 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.292980 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kc9q6" podStartSLOduration=28.278767237 podStartE2EDuration="41.292962352s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:16:34.01451135 +0000 UTC m=+34.693347999" lastFinishedPulling="2026-04-24 21:16:47.028706454 +0000 UTC m=+47.707543114" observedRunningTime="2026-04-24 21:16:47.292360622 +0000 UTC m=+47.971197295" watchObservedRunningTime="2026-04-24 21:16:47.292962352 +0000 UTC m=+47.971799026" Apr 24 21:16:47.295617 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.294110 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667bf47b7-wq8b5" podStartSLOduration=22.292429081 
podStartE2EDuration="35.29409741s" podCreationTimestamp="2026-04-24 21:16:12 +0000 UTC" firstStartedPulling="2026-04-24 21:16:34.02705071 +0000 UTC m=+34.705887360" lastFinishedPulling="2026-04-24 21:16:47.028719028 +0000 UTC m=+47.707555689" observedRunningTime="2026-04-24 21:16:47.276920861 +0000 UTC m=+47.955757533" watchObservedRunningTime="2026-04-24 21:16:47.29409741 +0000 UTC m=+47.972934083" Apr 24 21:16:47.335757 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.335703 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" podStartSLOduration=28.258919636 podStartE2EDuration="41.335684504s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:16:33.951195364 +0000 UTC m=+34.630032015" lastFinishedPulling="2026-04-24 21:16:47.02796022 +0000 UTC m=+47.706796883" observedRunningTime="2026-04-24 21:16:47.312492718 +0000 UTC m=+47.991329390" watchObservedRunningTime="2026-04-24 21:16:47.335684504 +0000 UTC m=+48.014521178" Apr 24 21:16:47.336031 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.335996 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-xfzqb" podStartSLOduration=28.743679692 podStartE2EDuration="41.335984924s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:16:34.001347303 +0000 UTC m=+34.680183968" lastFinishedPulling="2026-04-24 21:16:46.593652535 +0000 UTC m=+47.272489200" observedRunningTime="2026-04-24 21:16:47.333612683 +0000 UTC m=+48.012449356" watchObservedRunningTime="2026-04-24 21:16:47.335984924 +0000 UTC m=+48.014821597" Apr 24 21:16:47.352827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.352549 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" podStartSLOduration=22.334156689 
podStartE2EDuration="35.352533477s" podCreationTimestamp="2026-04-24 21:16:12 +0000 UTC" firstStartedPulling="2026-04-24 21:16:34.011389718 +0000 UTC m=+34.690226378" lastFinishedPulling="2026-04-24 21:16:47.029766501 +0000 UTC m=+47.708603166" observedRunningTime="2026-04-24 21:16:47.351613235 +0000 UTC m=+48.030449909" watchObservedRunningTime="2026-04-24 21:16:47.352533477 +0000 UTC m=+48.031370155" Apr 24 21:16:47.371887 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:47.371068 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fvtdb" podStartSLOduration=28.801382535 podStartE2EDuration="41.371054519s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:16:34.024196675 +0000 UTC m=+34.703033338" lastFinishedPulling="2026-04-24 21:16:46.593868665 +0000 UTC m=+47.272705322" observedRunningTime="2026-04-24 21:16:47.370572671 +0000 UTC m=+48.049409341" watchObservedRunningTime="2026-04-24 21:16:47.371054519 +0000 UTC m=+48.049891191" Apr 24 21:16:48.229295 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.229257 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml"] Apr 24 21:16:48.232033 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.232013 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" Apr 24 21:16:48.234783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.234565 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-94vks\"" Apr 24 21:16:48.234783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.234744 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:16:48.235465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.235339 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:16:48.246332 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.246311 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml"] Apr 24 21:16:48.267643 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.267620 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/0.log" Apr 24 21:16:48.267769 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.267663 2581 generic.go:358] "Generic (PLEG): container finished" podID="e187095c-23db-4e09-af90-8e136f238cec" containerID="b4bb73895a9ad1de1422a65f79c71a6bedcdc725eebe53864437653151582f67" exitCode=255 Apr 24 21:16:48.267875 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.267834 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" event={"ID":"e187095c-23db-4e09-af90-8e136f238cec","Type":"ContainerDied","Data":"b4bb73895a9ad1de1422a65f79c71a6bedcdc725eebe53864437653151582f67"} Apr 24 21:16:48.269695 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:16:48.268069 2581 scope.go:117] "RemoveContainer" containerID="b4bb73895a9ad1de1422a65f79c71a6bedcdc725eebe53864437653151582f67" Apr 24 21:16:48.271463 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.271367 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" event={"ID":"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45","Type":"ContainerStarted","Data":"6760dd643817561786fe97e103fe9eca3c770fec38fa7034557c267750556587"} Apr 24 21:16:48.274305 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.274270 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" event={"ID":"d799708d-6592-4222-b0e7-a25a20dc584e","Type":"ContainerStarted","Data":"d982e5276e0e50834dfb0f2789df7ea49d778c811921ae77c5ff5103ee86b200"} Apr 24 21:16:48.275653 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.275622 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6dfdcff997-vlddk" Apr 24 21:16:48.299299 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.297903 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns942\" (UniqueName: \"kubernetes.io/projected/d68a0f58-d2e4-4a3f-a00d-2554fcccb09c-kube-api-access-ns942\") pod \"migrator-74bb7799d9-swmml\" (UID: \"d68a0f58-d2e4-4a3f-a00d-2554fcccb09c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" Apr 24 21:16:48.312122 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.312036 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" podStartSLOduration=29.652204542 podStartE2EDuration="42.312021593s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" 
firstStartedPulling="2026-04-24 21:16:34.023233776 +0000 UTC m=+34.702070430" lastFinishedPulling="2026-04-24 21:16:46.683050827 +0000 UTC m=+47.361887481" observedRunningTime="2026-04-24 21:16:48.311797236 +0000 UTC m=+48.990633909" watchObservedRunningTime="2026-04-24 21:16:48.312021593 +0000 UTC m=+48.990858266" Apr 24 21:16:48.336814 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.335564 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nbsck"] Apr 24 21:16:48.339496 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.339144 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.345185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.345005 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:16:48.354330 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.353691 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nbsck"] Apr 24 21:16:48.402008 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.399978 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-kubelet-config\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.402008 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.400294 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-original-pull-secret\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.402008 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.401333 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-dbus\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.402008 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.401450 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns942\" (UniqueName: \"kubernetes.io/projected/d68a0f58-d2e4-4a3f-a00d-2554fcccb09c-kube-api-access-ns942\") pod \"migrator-74bb7799d9-swmml\" (UID: \"d68a0f58-d2e4-4a3f-a00d-2554fcccb09c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" Apr 24 21:16:48.414047 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.413981 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns942\" (UniqueName: \"kubernetes.io/projected/d68a0f58-d2e4-4a3f-a00d-2554fcccb09c-kube-api-access-ns942\") pod \"migrator-74bb7799d9-swmml\" (UID: \"d68a0f58-d2e4-4a3f-a00d-2554fcccb09c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" Apr 24 21:16:48.502795 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.502719 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-original-pull-secret\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.502795 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.502765 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-dbus\") pod \"global-pull-secret-syncer-nbsck\" (UID: 
\"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.502989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.502891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-kubelet-config\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.503049 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.502977 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-dbus\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.503049 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.502995 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-kubelet-config\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.505184 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.505151 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9f02e2a-ddce-4e3f-aec7-6bad8a006165-original-pull-secret\") pod \"global-pull-secret-syncer-nbsck\" (UID: \"a9f02e2a-ddce-4e3f-aec7-6bad8a006165\") " pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.544300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.544273 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" Apr 24 21:16:48.658282 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.657891 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nbsck" Apr 24 21:16:48.693941 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.693800 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml"] Apr 24 21:16:48.698732 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:48.698697 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd68a0f58_d2e4_4a3f_a00d_2554fcccb09c.slice/crio-d71923029764b44678e51b39901eef63f5bd8503ff1e42f05f10cf8f3c8660e7 WatchSource:0}: Error finding container d71923029764b44678e51b39901eef63f5bd8503ff1e42f05f10cf8f3c8660e7: Status 404 returned error can't find the container with id d71923029764b44678e51b39901eef63f5bd8503ff1e42f05f10cf8f3c8660e7 Apr 24 21:16:48.804196 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:48.804156 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nbsck"] Apr 24 21:16:49.210729 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.210697 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:16:49.210909 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.210830 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:16:49.210965 
ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.210911 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert podName:680befb0-2e56-4df6-b7ca-58caea84d887 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.210888612 +0000 UTC m=+65.889725264 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-499sv" (UID: "680befb0-2e56-4df6-b7ca-58caea84d887") : secret "networking-console-plugin-cert" not found Apr 24 21:16:49.279439 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.279394 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:16:49.279908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.279890 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/0.log" Apr 24 21:16:49.280001 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.279932 2581 generic.go:358] "Generic (PLEG): container finished" podID="e187095c-23db-4e09-af90-8e136f238cec" containerID="bb8204ef094654b107c2de81925d49509da8f3195b5946b147d677a7dc2010fc" exitCode=255 Apr 24 21:16:49.280064 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.280033 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" event={"ID":"e187095c-23db-4e09-af90-8e136f238cec","Type":"ContainerDied","Data":"bb8204ef094654b107c2de81925d49509da8f3195b5946b147d677a7dc2010fc"} Apr 24 21:16:49.280117 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.280088 2581 scope.go:117] "RemoveContainer" 
containerID="b4bb73895a9ad1de1422a65f79c71a6bedcdc725eebe53864437653151582f67" Apr 24 21:16:49.280357 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.280329 2581 scope.go:117] "RemoveContainer" containerID="bb8204ef094654b107c2de81925d49509da8f3195b5946b147d677a7dc2010fc" Apr 24 21:16:49.280610 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.280570 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-c7lrn_openshift-console-operator(e187095c-23db-4e09-af90-8e136f238cec)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" podUID="e187095c-23db-4e09-af90-8e136f238cec" Apr 24 21:16:49.281668 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.281644 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" event={"ID":"d68a0f58-d2e4-4a3f-a00d-2554fcccb09c","Type":"ContainerStarted","Data":"d71923029764b44678e51b39901eef63f5bd8503ff1e42f05f10cf8f3c8660e7"} Apr 24 21:16:49.311637 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.311596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:16:49.311782 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.311666 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " 
pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:16:49.311782 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.311759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:16:49.311882 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311759 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:16:49.311882 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311845 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:49.311882 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311859 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c69778f6c-6k4sq: secret "image-registry-tls" not found Apr 24 21:16:49.312009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311816 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:49.312009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311896 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls podName:0a690ce8-242f-4216-9ac1-7a4d0f94784b nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.311877397 +0000 UTC m=+65.990714053 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k7q4x" (UID: "0a690ce8-242f-4216-9ac1-7a4d0f94784b") : secret "samples-operator-tls" not found Apr 24 21:16:49.312009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311899 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68586bbdd8-8kw45: secret "image-registry-tls" not found Apr 24 21:16:49.312009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311913 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls podName:ff062d15-1ff3-4d8b-92be-3341e5f59abb nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.311903598 +0000 UTC m=+65.990740251 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls") pod "image-registry-6c69778f6c-6k4sq" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb") : secret "image-registry-tls" not found Apr 24 21:16:49.312009 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.311936 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls podName:d77d521f-bac2-47f1-80bd-1a4c7f08c799 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.311918115 +0000 UTC m=+65.990754772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls") pod "image-registry-68586bbdd8-8kw45" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799") : secret "image-registry-tls" not found Apr 24 21:16:49.370834 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:49.370803 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f02e2a_ddce_4e3f_aec7_6bad8a006165.slice/crio-241aed53b769f936de26a6321cf896c26c992b79b2b77ab980f8c9c97a5fb512 WatchSource:0}: Error finding container 241aed53b769f936de26a6321cf896c26c992b79b2b77ab980f8c9c97a5fb512: Status 404 returned error can't find the container with id 241aed53b769f936de26a6321cf896c26c992b79b2b77ab980f8c9c97a5fb512 Apr 24 21:16:49.412783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.412762 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.412910 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.412927 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.412966 2581 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls podName:d79fb269-2ec1-4e09-a8d6-9b2a367d21b6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.412949055 +0000 UTC m=+66.091785709 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls") pod "dns-default-w76jt" (UID: "d79fb269-2ec1-4e09-a8d6-9b2a367d21b6") : secret "dns-default-metrics-tls" not found Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413008 2581 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.413028 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413042 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls podName:a4c82719-9c98-4a75-864d-75fb12509cb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.413030981 +0000 UTC m=+66.091867638 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lhgp5" (UID: "a4c82719-9c98-4a75-864d-75fb12509cb1") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413085 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.413105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413114 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.413104052 +0000 UTC m=+66.091940717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : secret "router-metrics-certs-default" not found Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413170 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle podName:543220ca-e10a-465a-a5ee-a24026536361 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:05.413160781 +0000 UTC m=+66.091997434 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle") pod "router-default-5fdf56dbd-s82zg" (UID: "543220ca-e10a-465a-a5ee-a24026536361") : configmap references non-existent config key: service-ca.crt Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.413223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:16:49.413442 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413397 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:49.413870 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:49.413474 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert podName:aefd5dcc-7f58-4fab-8028-2cffcff95339 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.413456647 +0000 UTC m=+66.092293297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert") pod "ingress-canary-pxf27" (UID: "aefd5dcc-7f58-4fab-8028-2cffcff95339") : secret "canary-serving-cert" not found Apr 24 21:16:49.958200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.958123 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2mqlt"] Apr 24 21:16:49.960825 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.960764 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:49.963534 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.963507 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:16:49.963814 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.963792 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:16:49.963814 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.963807 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bqlv4\"" Apr 24 21:16:49.972903 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:49.972863 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2mqlt"] Apr 24 21:16:50.018621 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.018592 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.018769 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.018624 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-crio-socket\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.018836 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.018803 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qjlkv\" (UniqueName: \"kubernetes.io/projected/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-kube-api-access-qjlkv\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.018887 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.018854 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-data-volume\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.018963 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.018942 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.119748 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.119713 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjlkv\" (UniqueName: \"kubernetes.io/projected/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-kube-api-access-qjlkv\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.119907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.119758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-data-volume\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " 
pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.119907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.119809 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.119907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.119866 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.119907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.119896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-crio-socket\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.120108 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:50.119984 2581 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:16:50.120108 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.120023 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-crio-socket\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 
21:16:50.120108 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:50.120045 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls podName:ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:50.620026329 +0000 UTC m=+51.298862991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2mqlt" (UID: "ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9") : secret "insights-runtime-extractor-tls" not found Apr 24 21:16:50.120262 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.120173 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-data-volume\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.120614 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.120586 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.132606 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.132581 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjlkv\" (UniqueName: \"kubernetes.io/projected/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-kube-api-access-qjlkv\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 
21:16:50.286457 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.286371 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:16:50.286849 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.286822 2581 scope.go:117] "RemoveContainer" containerID="bb8204ef094654b107c2de81925d49509da8f3195b5946b147d677a7dc2010fc" Apr 24 21:16:50.287059 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:50.287030 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-c7lrn_openshift-console-operator(e187095c-23db-4e09-af90-8e136f238cec)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" podUID="e187095c-23db-4e09-af90-8e136f238cec" Apr 24 21:16:50.288000 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.287971 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nbsck" event={"ID":"a9f02e2a-ddce-4e3f-aec7-6bad8a006165","Type":"ContainerStarted","Data":"241aed53b769f936de26a6321cf896c26c992b79b2b77ab980f8c9c97a5fb512"} Apr 24 21:16:50.291281 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.291257 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" event={"ID":"d799708d-6592-4222-b0e7-a25a20dc584e","Type":"ContainerStarted","Data":"4e669f475ba2ca6610fc0d74cf3edd5550ad9e46f9f0e0799799f0f5539e3e46"} Apr 24 21:16:50.291370 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.291291 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" 
event={"ID":"d799708d-6592-4222-b0e7-a25a20dc584e","Type":"ContainerStarted","Data":"6c1a95d2658fb4a87e6d855624b8c79b44efec3e40530d5767c0dd6a47bc2d5a"} Apr 24 21:16:50.326243 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.325514 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" podStartSLOduration=22.635856477 podStartE2EDuration="38.325499534s" podCreationTimestamp="2026-04-24 21:16:12 +0000 UTC" firstStartedPulling="2026-04-24 21:16:33.716495308 +0000 UTC m=+34.395331959" lastFinishedPulling="2026-04-24 21:16:49.406138364 +0000 UTC m=+50.084975016" observedRunningTime="2026-04-24 21:16:50.324135873 +0000 UTC m=+51.002972547" watchObservedRunningTime="2026-04-24 21:16:50.325499534 +0000 UTC m=+51.004336207" Apr 24 21:16:50.625953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.625315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:50.625953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:50.625578 2581 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:16:50.625953 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:50.625638 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls podName:ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.625619076 +0000 UTC m=+52.304455732 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2mqlt" (UID: "ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9") : secret "insights-runtime-extractor-tls" not found Apr 24 21:16:50.868935 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:50.868851 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5pvk7_92521ad1-7ba9-4bdd-bc3b-f470cd17cfef/dns-node-resolver/0.log" Apr 24 21:16:51.043046 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.043012 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-d5swp"] Apr 24 21:16:51.045362 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.045345 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.048622 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.048598 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:16:51.048622 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.048613 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:16:51.048792 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.048773 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:16:51.048871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.048850 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-dqgrw\"" Apr 24 21:16:51.048972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.048907 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:16:51.055216 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.055196 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-d5swp"] Apr 24 21:16:51.131332 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.131260 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/16c78151-24a8-4713-8c08-ddf9b96dbb46-signing-cabundle\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.131487 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.131397 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/16c78151-24a8-4713-8c08-ddf9b96dbb46-signing-key\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.131545 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.131498 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwv8k\" (UniqueName: \"kubernetes.io/projected/16c78151-24a8-4713-8c08-ddf9b96dbb46-kube-api-access-zwv8k\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.232360 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.232325 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/16c78151-24a8-4713-8c08-ddf9b96dbb46-signing-cabundle\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 
21:16:51.232531 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.232514 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/16c78151-24a8-4713-8c08-ddf9b96dbb46-signing-key\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.232599 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.232584 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwv8k\" (UniqueName: \"kubernetes.io/projected/16c78151-24a8-4713-8c08-ddf9b96dbb46-kube-api-access-zwv8k\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.232989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.232961 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/16c78151-24a8-4713-8c08-ddf9b96dbb46-signing-cabundle\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.235061 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.235034 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/16c78151-24a8-4713-8c08-ddf9b96dbb46-signing-key\") pod \"service-ca-865cb79987-d5swp\" (UID: \"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.242674 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.242651 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwv8k\" (UniqueName: \"kubernetes.io/projected/16c78151-24a8-4713-8c08-ddf9b96dbb46-kube-api-access-zwv8k\") pod \"service-ca-865cb79987-d5swp\" (UID: 
\"16c78151-24a8-4713-8c08-ddf9b96dbb46\") " pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.295942 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.295903 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" event={"ID":"d68a0f58-d2e4-4a3f-a00d-2554fcccb09c","Type":"ContainerStarted","Data":"b08491e78291fc6e27e6bcea04a49e3b8d4563562d1760a832eb0053a9396df8"} Apr 24 21:16:51.296312 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.295955 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" event={"ID":"d68a0f58-d2e4-4a3f-a00d-2554fcccb09c","Type":"ContainerStarted","Data":"9fd9436f2bce0dcf25e2f36550d490b98781074a050ebf6d57bce18ec028fc82"} Apr 24 21:16:51.311856 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.311811 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-swmml" podStartSLOduration=1.4700799390000001 podStartE2EDuration="3.311798904s" podCreationTimestamp="2026-04-24 21:16:48 +0000 UTC" firstStartedPulling="2026-04-24 21:16:48.701200646 +0000 UTC m=+49.380037295" lastFinishedPulling="2026-04-24 21:16:50.542919609 +0000 UTC m=+51.221756260" observedRunningTime="2026-04-24 21:16:51.310951345 +0000 UTC m=+51.989788018" watchObservedRunningTime="2026-04-24 21:16:51.311798904 +0000 UTC m=+51.990635599" Apr 24 21:16:51.354496 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.354467 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-d5swp" Apr 24 21:16:51.483472 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.483447 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-d5swp"] Apr 24 21:16:51.485815 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:16:51.485780 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c78151_24a8_4713_8c08_ddf9b96dbb46.slice/crio-0a13021bb7278f59019b53101bfc49d8b2fc8f6ee5e7c10ac28485a370a47b63 WatchSource:0}: Error finding container 0a13021bb7278f59019b53101bfc49d8b2fc8f6ee5e7c10ac28485a370a47b63: Status 404 returned error can't find the container with id 0a13021bb7278f59019b53101bfc49d8b2fc8f6ee5e7c10ac28485a370a47b63 Apr 24 21:16:51.636799 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.636749 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:51.636974 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:51.636927 2581 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:16:51.637031 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:51.636995 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls podName:ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:53.636977236 +0000 UTC m=+54.315813887 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2mqlt" (UID: "ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9") : secret "insights-runtime-extractor-tls" not found Apr 24 21:16:51.866257 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:51.866216 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nwpfl_801b2a6e-b16d-4e63-b007-af7d6c9273f5/node-ca/0.log" Apr 24 21:16:52.301046 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:52.300994 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-d5swp" event={"ID":"16c78151-24a8-4713-8c08-ddf9b96dbb46","Type":"ContainerStarted","Data":"1ebda6413895f82120e5dfc5e49bf8ad56040a49806f4458b49b0c9a7945029b"} Apr 24 21:16:52.301046 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:52.301045 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-d5swp" event={"ID":"16c78151-24a8-4713-8c08-ddf9b96dbb46","Type":"ContainerStarted","Data":"0a13021bb7278f59019b53101bfc49d8b2fc8f6ee5e7c10ac28485a370a47b63"} Apr 24 21:16:52.324400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:52.324348 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-d5swp" podStartSLOduration=1.324329742 podStartE2EDuration="1.324329742s" podCreationTimestamp="2026-04-24 21:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:52.323018795 +0000 UTC m=+53.001855467" watchObservedRunningTime="2026-04-24 21:16:52.324329742 +0000 UTC m=+53.003166412" Apr 24 21:16:52.467755 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:52.467702 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-swmml_d68a0f58-d2e4-4a3f-a00d-2554fcccb09c/migrator/0.log" Apr 24 21:16:52.673889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:52.673828 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-swmml_d68a0f58-d2e4-4a3f-a00d-2554fcccb09c/graceful-termination/0.log" Apr 24 21:16:52.875056 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:52.875021 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z962g_cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45/kube-storage-version-migrator-operator/0.log" Apr 24 21:16:53.555885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:53.554843 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:53.555885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:53.554884 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:16:53.555885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:53.555309 2581 scope.go:117] "RemoveContainer" containerID="bb8204ef094654b107c2de81925d49509da8f3195b5946b147d677a7dc2010fc" Apr 24 21:16:53.555885 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:53.555538 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-c7lrn_openshift-console-operator(e187095c-23db-4e09-af90-8e136f238cec)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" podUID="e187095c-23db-4e09-af90-8e136f238cec" Apr 24 21:16:53.659445 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:53.658536 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:53.659445 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:53.659415 2581 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:16:53.659607 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:53.659493 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls podName:ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:57.659473771 +0000 UTC m=+58.338310422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2mqlt" (UID: "ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9") : secret "insights-runtime-extractor-tls" not found Apr 24 21:16:54.307257 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:54.307226 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nbsck" event={"ID":"a9f02e2a-ddce-4e3f-aec7-6bad8a006165","Type":"ContainerStarted","Data":"c84951003aa0a01cf296bc6237aeb15ed91aade177617fa87d93a5e0d3b99927"} Apr 24 21:16:54.333923 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:54.333881 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nbsck" podStartSLOduration=2.101974305 podStartE2EDuration="6.333868152s" podCreationTimestamp="2026-04-24 21:16:48 +0000 UTC" 
firstStartedPulling="2026-04-24 21:16:49.373018528 +0000 UTC m=+50.051855190" lastFinishedPulling="2026-04-24 21:16:53.604912374 +0000 UTC m=+54.283749037" observedRunningTime="2026-04-24 21:16:54.330437562 +0000 UTC m=+55.009274225" watchObservedRunningTime="2026-04-24 21:16:54.333868152 +0000 UTC m=+55.012704823" Apr 24 21:16:57.696959 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:57.696906 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:16:57.697481 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:57.697061 2581 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:16:57.697481 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:16:57.697127 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls podName:ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:05.69711214 +0000 UTC m=+66.375948790 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2mqlt" (UID: "ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9") : secret "insights-runtime-extractor-tls" not found Apr 24 21:16:59.026312 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:16:59.026284 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qnlsv" Apr 24 21:17:05.268165 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.268118 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:17:05.270514 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.270494 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/680befb0-2e56-4df6-b7ca-58caea84d887-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-499sv\" (UID: \"680befb0-2e56-4df6-b7ca-58caea84d887\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:17:05.369121 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.369093 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:05.369269 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.369156 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:17:05.369310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.369301 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:17:05.371876 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.371850 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"image-registry-6c69778f6c-6k4sq\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:17:05.372021 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.372000 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0a690ce8-242f-4216-9ac1-7a4d0f94784b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7q4x\" (UID: \"0a690ce8-242f-4216-9ac1-7a4d0f94784b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:17:05.372068 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.372000 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"image-registry-68586bbdd8-8kw45\" (UID: 
\"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:05.374333 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.374316 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tjpsp\"" Apr 24 21:17:05.382178 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.382159 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" Apr 24 21:17:05.470561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.470164 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:17:05.470561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.470241 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:05.470561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.470284 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:05.470561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.470323 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:17:05.470561 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.470367 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:17:05.471515 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.471484 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/543220ca-e10a-465a-a5ee-a24026536361-service-ca-bundle\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:05.473187 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.473161 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4c82719-9c98-4a75-864d-75fb12509cb1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lhgp5\" (UID: \"a4c82719-9c98-4a75-864d-75fb12509cb1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:17:05.473610 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.473586 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d79fb269-2ec1-4e09-a8d6-9b2a367d21b6-metrics-tls\") pod \"dns-default-w76jt\" (UID: \"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6\") " pod="openshift-dns/dns-default-w76jt" Apr 24 21:17:05.473701 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.473586 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/543220ca-e10a-465a-a5ee-a24026536361-metrics-certs\") pod \"router-default-5fdf56dbd-s82zg\" (UID: \"543220ca-e10a-465a-a5ee-a24026536361\") " pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:05.473764 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.473747 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aefd5dcc-7f58-4fab-8028-2cffcff95339-cert\") pod \"ingress-canary-pxf27\" (UID: \"aefd5dcc-7f58-4fab-8028-2cffcff95339\") " pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:17:05.497355 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.497337 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbr4t\"" Apr 24 21:17:05.505310 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.505288 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:05.506173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.506156 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-499sv"] Apr 24 21:17:05.511278 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:05.511255 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680befb0_2e56_4df6_b7ca_58caea84d887.slice/crio-6b282d644aa14ff11fd9707607d61fb6e443fe140a99422040e02d9c39409201 WatchSource:0}: Error finding container 6b282d644aa14ff11fd9707607d61fb6e443fe140a99422040e02d9c39409201: Status 404 returned error can't find the container with id 6b282d644aa14ff11fd9707607d61fb6e443fe140a99422040e02d9c39409201 Apr 24 21:17:05.554334 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.554308 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:17:05.563378 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.563357 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-vjpgj\"" Apr 24 21:17:05.567669 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.567650 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-55vb9\"" Apr 24 21:17:05.570732 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.570711 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" Apr 24 21:17:05.576552 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.576474 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" Apr 24 21:17:05.593219 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.592997 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-qjdj8\"" Apr 24 21:17:05.600551 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.600437 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:05.604379 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.604155 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rt6g8\"" Apr 24 21:17:05.612012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.611605 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pxf27" Apr 24 21:17:05.616768 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.616735 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vh4z7\"" Apr 24 21:17:05.626184 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.625290 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w76jt" Apr 24 21:17:05.640354 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.640050 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68586bbdd8-8kw45"] Apr 24 21:17:05.672279 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.671888 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:17:05.674811 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.674673 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:05.690223 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.689801 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/657a2c9b-4e75-4d61-bff2-d8abdd05825d-metrics-certs\") pod \"network-metrics-daemon-n487x\" (UID: \"657a2c9b-4e75-4d61-bff2-d8abdd05825d\") " pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:17:05.736052 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.730019 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c69778f6c-6k4sq"] Apr 24 21:17:05.776664 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.776540 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:17:05.779884 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:17:05.779829 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mqlt\" (UID: \"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9\") " pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:17:05.797121 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.797090 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x"] Apr 24 21:17:05.809249 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.809223 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8v42p\"" Apr 24 21:17:05.816607 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.816574 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n487x" Apr 24 21:17:05.837904 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.837787 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5"] Apr 24 21:17:05.849006 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.848960 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5fdf56dbd-s82zg"] Apr 24 21:17:05.856468 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:05.856394 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543220ca_e10a_465a_a5ee_a24026536361.slice/crio-50ac7496d799a7c6fdbb4459b6d4343aaf9f8b9b3eded4d292ba4198b37bad74 WatchSource:0}: Error finding container 50ac7496d799a7c6fdbb4459b6d4343aaf9f8b9b3eded4d292ba4198b37bad74: Status 404 returned error can't find the container with id 50ac7496d799a7c6fdbb4459b6d4343aaf9f8b9b3eded4d292ba4198b37bad74 Apr 24 
21:17:05.869951 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.869921 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pxf27"] Apr 24 21:17:05.876221 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:05.875457 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaefd5dcc_7f58_4fab_8028_2cffcff95339.slice/crio-28fd813d02b52a14cf3b72d4b594b905b831aaa1921c3a7c0867c67bfc320ecf WatchSource:0}: Error finding container 28fd813d02b52a14cf3b72d4b594b905b831aaa1921c3a7c0867c67bfc320ecf: Status 404 returned error can't find the container with id 28fd813d02b52a14cf3b72d4b594b905b831aaa1921c3a7c0867c67bfc320ecf Apr 24 21:17:05.876221 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.875592 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bqlv4\"" Apr 24 21:17:05.876910 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.876885 2581 scope.go:117] "RemoveContainer" containerID="bb8204ef094654b107c2de81925d49509da8f3195b5946b147d677a7dc2010fc" Apr 24 21:17:05.884553 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.884185 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2mqlt" Apr 24 21:17:05.889617 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:05.889592 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w76jt"] Apr 24 21:17:06.003143 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.003107 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n487x"] Apr 24 21:17:06.064720 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.064642 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2mqlt"] Apr 24 21:17:06.068464 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:06.068416 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5c2f4e_e1d8_48e1_bcc0_bf72afefb8e9.slice/crio-9de35aefd9e9297af4446aa2477d1320c4a4e113d8e0ff2bcee9f7e14208da71 WatchSource:0}: Error finding container 9de35aefd9e9297af4446aa2477d1320c4a4e113d8e0ff2bcee9f7e14208da71: Status 404 returned error can't find the container with id 9de35aefd9e9297af4446aa2477d1320c4a4e113d8e0ff2bcee9f7e14208da71 Apr 24 21:17:06.345818 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.345632 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" event={"ID":"a4c82719-9c98-4a75-864d-75fb12509cb1","Type":"ContainerStarted","Data":"6106056c9d560fddcb13724a452358eb46e6dd47ff4a026089256726540ae4ca"} Apr 24 21:17:06.347972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.347946 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:17:06.348096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.348040 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" event={"ID":"e187095c-23db-4e09-af90-8e136f238cec","Type":"ContainerStarted","Data":"9ae199ec078f38b8672ea664c2a569424f4d8ccdedc48544749babdffa4c8b72"} Apr 24 21:17:06.348590 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.348556 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:17:06.350746 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.350720 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" event={"ID":"d77d521f-bac2-47f1-80bd-1a4c7f08c799","Type":"ContainerStarted","Data":"7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9"} Apr 24 21:17:06.350853 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.350753 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" event={"ID":"d77d521f-bac2-47f1-80bd-1a4c7f08c799","Type":"ContainerStarted","Data":"f4e8e8e1642fb6c2e459b31fd750452fa6b8f2627f7ce4f4c63005f60ab2d895"} Apr 24 21:17:06.351154 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.351108 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:06.353966 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.353612 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" event={"ID":"543220ca-e10a-465a-a5ee-a24026536361","Type":"ContainerStarted","Data":"e194010ef0fe1ae3f2dcfe7550067ed526bb2cc14895790b7db0229faed1569d"} Apr 24 21:17:06.353966 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.353642 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" 
event={"ID":"543220ca-e10a-465a-a5ee-a24026536361","Type":"ContainerStarted","Data":"50ac7496d799a7c6fdbb4459b6d4343aaf9f8b9b3eded4d292ba4198b37bad74"} Apr 24 21:17:06.355475 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.355438 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w76jt" event={"ID":"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6","Type":"ContainerStarted","Data":"1d2e7fc69fbe9edbb364be2519647cce3542896b5ff569db0f2eac28bcf4609e"} Apr 24 21:17:06.356537 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.356514 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" event={"ID":"0a690ce8-242f-4216-9ac1-7a4d0f94784b","Type":"ContainerStarted","Data":"36b26cd7329d5f722bc1de4c9def9b376aff2db83ab9eec77da529ff59fb957a"} Apr 24 21:17:06.358030 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.357984 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mqlt" event={"ID":"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9","Type":"ContainerStarted","Data":"dea43044dec89af0ef989d67a67d998f3d25a9dd85b8a76d01919ac50ff37033"} Apr 24 21:17:06.358030 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.358012 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mqlt" event={"ID":"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9","Type":"ContainerStarted","Data":"9de35aefd9e9297af4446aa2477d1320c4a4e113d8e0ff2bcee9f7e14208da71"} Apr 24 21:17:06.359157 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.359124 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n487x" event={"ID":"657a2c9b-4e75-4d61-bff2-d8abdd05825d","Type":"ContainerStarted","Data":"367f3fdd3b45ddaaed3c5813fb2d5aea88e4f5cc31c6f6743583961228b3b83a"} Apr 24 21:17:06.360224 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.360201 2581 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pxf27" event={"ID":"aefd5dcc-7f58-4fab-8028-2cffcff95339","Type":"ContainerStarted","Data":"28fd813d02b52a14cf3b72d4b594b905b831aaa1921c3a7c0867c67bfc320ecf"} Apr 24 21:17:06.361579 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.361543 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" event={"ID":"680befb0-2e56-4df6-b7ca-58caea84d887","Type":"ContainerStarted","Data":"6b282d644aa14ff11fd9707607d61fb6e443fe140a99422040e02d9c39409201"} Apr 24 21:17:06.363805 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.363782 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" event={"ID":"ff062d15-1ff3-4d8b-92be-3341e5f59abb","Type":"ContainerStarted","Data":"f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958"} Apr 24 21:17:06.363901 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.363824 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" event={"ID":"ff062d15-1ff3-4d8b-92be-3341e5f59abb","Type":"ContainerStarted","Data":"4292814f213fe124b26a1670d0ffb582b63d9eb1e97c03de7ebc2d11defddc38"} Apr 24 21:17:06.363953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.363923 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:17:06.369818 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.369784 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" podStartSLOduration=47.587572876 podStartE2EDuration="1m0.369772686s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:16:33.71707378 +0000 UTC m=+34.395910430" lastFinishedPulling="2026-04-24 21:16:46.499273573 +0000 UTC 
m=+47.178110240" observedRunningTime="2026-04-24 21:17:06.36801729 +0000 UTC m=+67.046853973" watchObservedRunningTime="2026-04-24 21:17:06.369772686 +0000 UTC m=+67.048609357" Apr 24 21:17:06.389437 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.389369 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" podStartSLOduration=66.389350855 podStartE2EDuration="1m6.389350855s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:06.388843183 +0000 UTC m=+67.067679855" watchObservedRunningTime="2026-04-24 21:17:06.389350855 +0000 UTC m=+67.068187529" Apr 24 21:17:06.410506 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.410451 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" podStartSLOduration=60.410413498 podStartE2EDuration="1m0.410413498s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:06.409864086 +0000 UTC m=+67.088700810" watchObservedRunningTime="2026-04-24 21:17:06.410413498 +0000 UTC m=+67.089250172" Apr 24 21:17:06.433753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.433714 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" podStartSLOduration=60.433700576 podStartE2EDuration="1m0.433700576s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:06.432956777 +0000 UTC m=+67.111793450" watchObservedRunningTime="2026-04-24 21:17:06.433700576 +0000 UTC m=+67.112537248" Apr 24 
21:17:06.601049 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.600965 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:06.603882 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.603858 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:06.945774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:06.945650 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-c7lrn" Apr 24 21:17:07.376018 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:07.375590 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" event={"ID":"680befb0-2e56-4df6-b7ca-58caea84d887","Type":"ContainerStarted","Data":"cb9ac0f3eed90a36cd520b1434a7505315b129ea43ed5840bd50c4578b31e549"} Apr 24 21:17:07.377100 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:07.376610 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:07.378202 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:07.378118 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5fdf56dbd-s82zg" Apr 24 21:17:07.416531 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:07.416479 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-499sv" podStartSLOduration=51.146489435 podStartE2EDuration="52.41645905s" podCreationTimestamp="2026-04-24 21:16:15 +0000 UTC" firstStartedPulling="2026-04-24 21:17:05.513034648 +0000 UTC m=+66.191871302" lastFinishedPulling="2026-04-24 21:17:06.783004253 +0000 UTC m=+67.461840917" observedRunningTime="2026-04-24 21:17:07.393169718 +0000 UTC 
m=+68.072006390" watchObservedRunningTime="2026-04-24 21:17:07.41645905 +0000 UTC m=+68.095295723" Apr 24 21:17:08.379259 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:08.379220 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mqlt" event={"ID":"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9","Type":"ContainerStarted","Data":"464a63419530a3edfd8d80f63f50a54d1697bc2f130d92a7ea61f0571be3e842"} Apr 24 21:17:10.394919 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.394858 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mqlt" event={"ID":"ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9","Type":"ContainerStarted","Data":"facb453baaf488ed02386b9cfca7a1e7e71a912ede65e57ff627f8061011b761"} Apr 24 21:17:10.397293 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.397261 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n487x" event={"ID":"657a2c9b-4e75-4d61-bff2-d8abdd05825d","Type":"ContainerStarted","Data":"0c620f2c126c74f2571fa0e39e5a7e497b1fcfc6898c6103abe93fedc9bbe235"} Apr 24 21:17:10.402816 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.401321 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pxf27" event={"ID":"aefd5dcc-7f58-4fab-8028-2cffcff95339","Type":"ContainerStarted","Data":"19814c46ad18757541fa6897aca68e5eab450dc00784a7f768c02567e0338a57"} Apr 24 21:17:10.404761 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.404739 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" event={"ID":"a4c82719-9c98-4a75-864d-75fb12509cb1","Type":"ContainerStarted","Data":"1b2b86502d5fbd2c5449097d2aa0c5899c73a493af9640be8ecadd22f056227e"} Apr 24 21:17:10.406786 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.406726 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-w76jt" event={"ID":"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6","Type":"ContainerStarted","Data":"90a4aff75124238bd80fbb7e3d2fe20f034d20b832cbcbdfb52f5cd6f38a5462"} Apr 24 21:17:10.409629 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.409100 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" event={"ID":"0a690ce8-242f-4216-9ac1-7a4d0f94784b","Type":"ContainerStarted","Data":"17280d8041a9380940a8f617dcd0e5e8ce6b9238c77a87a3daec03218f73052f"} Apr 24 21:17:10.409629 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.409127 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" event={"ID":"0a690ce8-242f-4216-9ac1-7a4d0f94784b","Type":"ContainerStarted","Data":"376dbbc78bd42b8c5b65679068e441136912dbaefefa7ed05d4bbbfb13a1c3a9"} Apr 24 21:17:10.423255 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.422790 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2mqlt" podStartSLOduration=17.386284451 podStartE2EDuration="21.422772114s" podCreationTimestamp="2026-04-24 21:16:49 +0000 UTC" firstStartedPulling="2026-04-24 21:17:06.14211197 +0000 UTC m=+66.820948622" lastFinishedPulling="2026-04-24 21:17:10.17859962 +0000 UTC m=+70.857436285" observedRunningTime="2026-04-24 21:17:10.421012399 +0000 UTC m=+71.099849111" watchObservedRunningTime="2026-04-24 21:17:10.422772114 +0000 UTC m=+71.101608788" Apr 24 21:17:10.457451 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.456593 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lhgp5" podStartSLOduration=60.194575869 podStartE2EDuration="1m4.456546162s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:17:05.844827726 +0000 UTC 
m=+66.523664400" lastFinishedPulling="2026-04-24 21:17:10.106798039 +0000 UTC m=+70.785634693" observedRunningTime="2026-04-24 21:17:10.439877063 +0000 UTC m=+71.118713736" watchObservedRunningTime="2026-04-24 21:17:10.456546162 +0000 UTC m=+71.135382834" Apr 24 21:17:10.457451 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.457236 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pxf27" podStartSLOduration=33.227716905 podStartE2EDuration="37.45722603s" podCreationTimestamp="2026-04-24 21:16:33 +0000 UTC" firstStartedPulling="2026-04-24 21:17:05.87722541 +0000 UTC m=+66.556062062" lastFinishedPulling="2026-04-24 21:17:10.106734528 +0000 UTC m=+70.785571187" observedRunningTime="2026-04-24 21:17:10.455650039 +0000 UTC m=+71.134486711" watchObservedRunningTime="2026-04-24 21:17:10.45722603 +0000 UTC m=+71.136062703" Apr 24 21:17:10.478914 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:10.478805 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7q4x" podStartSLOduration=60.227634921 podStartE2EDuration="1m4.47878258s" podCreationTimestamp="2026-04-24 21:16:06 +0000 UTC" firstStartedPulling="2026-04-24 21:17:05.855468737 +0000 UTC m=+66.534305390" lastFinishedPulling="2026-04-24 21:17:10.106616384 +0000 UTC m=+70.785453049" observedRunningTime="2026-04-24 21:17:10.478664007 +0000 UTC m=+71.157500690" watchObservedRunningTime="2026-04-24 21:17:10.47878258 +0000 UTC m=+71.157619261" Apr 24 21:17:11.414651 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:11.414617 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w76jt" event={"ID":"d79fb269-2ec1-4e09-a8d6-9b2a367d21b6","Type":"ContainerStarted","Data":"c5a07891491f630c4b29cae496d844431a4e97a3f557588214e62d9d77ba97bd"} Apr 24 21:17:11.415095 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:11.414775 2581 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w76jt" Apr 24 21:17:11.416280 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:11.416251 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n487x" event={"ID":"657a2c9b-4e75-4d61-bff2-d8abdd05825d","Type":"ContainerStarted","Data":"9f7ca46999630bd1708a330772e6312a91ae8ac4e606372adfa1519c851042ea"} Apr 24 21:17:11.430632 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:11.430588 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w76jt" podStartSLOduration=34.225914865 podStartE2EDuration="38.430576606s" podCreationTimestamp="2026-04-24 21:16:33 +0000 UTC" firstStartedPulling="2026-04-24 21:17:05.902294412 +0000 UTC m=+66.581131063" lastFinishedPulling="2026-04-24 21:17:10.106956141 +0000 UTC m=+70.785792804" observedRunningTime="2026-04-24 21:17:11.430006438 +0000 UTC m=+72.108843111" watchObservedRunningTime="2026-04-24 21:17:11.430576606 +0000 UTC m=+72.109413278" Apr 24 21:17:11.444416 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:11.444373 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n487x" podStartSLOduration=67.351128216 podStartE2EDuration="1m11.444361398s" podCreationTimestamp="2026-04-24 21:16:00 +0000 UTC" firstStartedPulling="2026-04-24 21:17:06.013710564 +0000 UTC m=+66.692547230" lastFinishedPulling="2026-04-24 21:17:10.10694375 +0000 UTC m=+70.785780412" observedRunningTime="2026-04-24 21:17:11.44303441 +0000 UTC m=+72.121871082" watchObservedRunningTime="2026-04-24 21:17:11.444361398 +0000 UTC m=+72.123198070" Apr 24 21:17:16.858228 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.858189 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-mlhtc"] Apr 24 21:17:16.919596 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.919547 2581 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-mlhtc"] Apr 24 21:17:16.919596 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.919598 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68586bbdd8-8kw45"] Apr 24 21:17:16.920020 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.919679 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:16.924223 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.924200 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:17:16.924894 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.924878 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:17:16.925037 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.925023 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-z6grj\"" Apr 24 21:17:16.968819 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.968790 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5"] Apr 24 21:17:16.972657 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.972633 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcpm4\" (UniqueName: \"kubernetes.io/projected/6eacc4cf-b65c-4484-bacb-a5c0e01cefac-kube-api-access-pcpm4\") pod \"downloads-6bcc868b7-mlhtc\" (UID: \"6eacc4cf-b65c-4484-bacb-a5c0e01cefac\") " pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:16.995514 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.995485 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5"] Apr 24 21:17:16.995645 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.995593 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:16.998165 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.998141 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:17:16.998276 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:16.998188 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-j5wqg\"" Apr 24 21:17:17.073986 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.073953 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/05babe0a-65a7-4208-bdf7-46e2e3b914e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fthv5\" (UID: \"05babe0a-65a7-4208-bdf7-46e2e3b914e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:17.074138 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.074054 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcpm4\" (UniqueName: \"kubernetes.io/projected/6eacc4cf-b65c-4484-bacb-a5c0e01cefac-kube-api-access-pcpm4\") pod \"downloads-6bcc868b7-mlhtc\" (UID: \"6eacc4cf-b65c-4484-bacb-a5c0e01cefac\") " pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:17.084661 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.084608 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcpm4\" (UniqueName: \"kubernetes.io/projected/6eacc4cf-b65c-4484-bacb-a5c0e01cefac-kube-api-access-pcpm4\") pod 
\"downloads-6bcc868b7-mlhtc\" (UID: \"6eacc4cf-b65c-4484-bacb-a5c0e01cefac\") " pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:17.175072 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.175042 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/05babe0a-65a7-4208-bdf7-46e2e3b914e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fthv5\" (UID: \"05babe0a-65a7-4208-bdf7-46e2e3b914e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:17.177330 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.177310 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/05babe0a-65a7-4208-bdf7-46e2e3b914e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fthv5\" (UID: \"05babe0a-65a7-4208-bdf7-46e2e3b914e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:17.230785 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.230765 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:17.305729 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.305634 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:17.360823 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.360789 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-mlhtc"] Apr 24 21:17:17.369813 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:17.369782 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eacc4cf_b65c_4484_bacb_a5c0e01cefac.slice/crio-66135c2b1a71af4d859a461e950b4adbe8eabc19920d89c4c60c8d5f1eca24c4 WatchSource:0}: Error finding container 66135c2b1a71af4d859a461e950b4adbe8eabc19920d89c4c60c8d5f1eca24c4: Status 404 returned error can't find the container with id 66135c2b1a71af4d859a461e950b4adbe8eabc19920d89c4c60c8d5f1eca24c4 Apr 24 21:17:17.433887 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.433859 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-mlhtc" event={"ID":"6eacc4cf-b65c-4484-bacb-a5c0e01cefac","Type":"ContainerStarted","Data":"66135c2b1a71af4d859a461e950b4adbe8eabc19920d89c4c60c8d5f1eca24c4"} Apr 24 21:17:17.439986 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:17.439966 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5"] Apr 24 21:17:17.443823 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:17.443802 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05babe0a_65a7_4208_bdf7_46e2e3b914e9.slice/crio-3504c02e1e4a0bedafebef26c9783c6738c4b6a2e69b51e1246f7aa7e6077efe WatchSource:0}: Error finding container 3504c02e1e4a0bedafebef26c9783c6738c4b6a2e69b51e1246f7aa7e6077efe: Status 404 returned error can't find the container with id 3504c02e1e4a0bedafebef26c9783c6738c4b6a2e69b51e1246f7aa7e6077efe Apr 24 21:17:18.078245 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.078213 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d4f99bf67-2gt8x"] Apr 24 21:17:18.095252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.095218 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.095603 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.095580 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4f99bf67-2gt8x"] Apr 24 21:17:18.098229 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.098200 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:17:18.098760 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.098743 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-stf74\"" Apr 24 21:17:18.100330 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.100298 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:17:18.101285 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.100835 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:17:18.101285 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.100870 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:17:18.101285 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.101098 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:17:18.182607 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.182535 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-config\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.182607 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.182571 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-service-ca\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.182607 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.182605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-serving-cert\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.182848 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.182648 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-oauth-config\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.182848 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.182696 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrwmd\" (UniqueName: \"kubernetes.io/projected/9ffae187-6cbb-4cdc-a65c-42f879939f6d-kube-api-access-qrwmd\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.182848 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:17:18.182763 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-oauth-serving-cert\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.283554 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-config\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.283608 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-service-ca\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.283644 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-serving-cert\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.283706 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-oauth-config\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" 
Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.283730 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrwmd\" (UniqueName: \"kubernetes.io/projected/9ffae187-6cbb-4cdc-a65c-42f879939f6d-kube-api-access-qrwmd\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.283779 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-oauth-serving-cert\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.284364 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-config\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284876 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.284381 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-oauth-serving-cert\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.284876 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.284718 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-service-ca\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " 
pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.287006 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.286980 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-serving-cert\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.287170 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.287147 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-oauth-config\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.293714 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.293687 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrwmd\" (UniqueName: \"kubernetes.io/projected/9ffae187-6cbb-4cdc-a65c-42f879939f6d-kube-api-access-qrwmd\") pod \"console-d4f99bf67-2gt8x\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") " pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.408885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.408795 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:18.438532 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.438494 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" event={"ID":"05babe0a-65a7-4208-bdf7-46e2e3b914e9","Type":"ContainerStarted","Data":"3504c02e1e4a0bedafebef26c9783c6738c4b6a2e69b51e1246f7aa7e6077efe"} Apr 24 21:17:18.922013 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:18.921981 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4f99bf67-2gt8x"] Apr 24 21:17:18.999732 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:18.999696 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ffae187_6cbb_4cdc_a65c_42f879939f6d.slice/crio-73c44fffe8758501123416d61472579d80592def64e305280657cde2b7ee69fa WatchSource:0}: Error finding container 73c44fffe8758501123416d61472579d80592def64e305280657cde2b7ee69fa: Status 404 returned error can't find the container with id 73c44fffe8758501123416d61472579d80592def64e305280657cde2b7ee69fa Apr 24 21:17:19.285719 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.285578 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5shjj" Apr 24 21:17:19.443010 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.442965 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" event={"ID":"05babe0a-65a7-4208-bdf7-46e2e3b914e9","Type":"ContainerStarted","Data":"e8a66eae91c48d3e1a6d4a11a86bde216664a55f71c9be01fca2ac9f260e4752"} Apr 24 21:17:19.443193 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.443169 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:19.444298 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.444239 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f99bf67-2gt8x" event={"ID":"9ffae187-6cbb-4cdc-a65c-42f879939f6d","Type":"ContainerStarted","Data":"73c44fffe8758501123416d61472579d80592def64e305280657cde2b7ee69fa"} Apr 24 21:17:19.448590 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.448566 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" Apr 24 21:17:19.464806 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.464766 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fthv5" podStartSLOduration=1.8939811519999998 podStartE2EDuration="3.464750285s" podCreationTimestamp="2026-04-24 21:17:16 +0000 UTC" firstStartedPulling="2026-04-24 21:17:17.445594662 +0000 UTC m=+78.124431315" lastFinishedPulling="2026-04-24 21:17:19.016363795 +0000 UTC m=+79.695200448" observedRunningTime="2026-04-24 21:17:19.464297319 +0000 UTC m=+80.143133992" watchObservedRunningTime="2026-04-24 21:17:19.464750285 +0000 UTC m=+80.143586959" Apr 24 21:17:19.864269 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.864240 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v7997"] Apr 24 21:17:19.896933 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.896905 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v7997"] Apr 24 21:17:19.897093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.897063 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:19.900032 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.900007 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-c26pp\"" Apr 24 21:17:19.900198 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.900183 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:17:19.900361 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.900344 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:17:19.900449 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:19.900236 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:17:20.001064 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.000991 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgvd\" (UniqueName: \"kubernetes.io/projected/de8a23e4-07a9-437d-baed-a6d60d4f5485-kube-api-access-psgvd\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.001064 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.001046 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.001299 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:17:20.001127 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de8a23e4-07a9-437d-baed-a6d60d4f5485-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.001299 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.001193 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.102511 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.102466 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de8a23e4-07a9-437d-baed-a6d60d4f5485-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.102709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.102555 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.102709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.102596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psgvd\" (UniqueName: 
\"kubernetes.io/projected/de8a23e4-07a9-437d-baed-a6d60d4f5485-kube-api-access-psgvd\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.102709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.102631 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.102881 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:17:20.102711 2581 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 21:17:20.102881 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:17:20.102790 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-tls podName:de8a23e4-07a9-437d-baed-a6d60d4f5485 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:20.602767684 +0000 UTC m=+81.281604337 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-v7997" (UID: "de8a23e4-07a9-437d-baed-a6d60d4f5485") : secret "prometheus-operator-tls" not found Apr 24 21:17:20.103267 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.103220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de8a23e4-07a9-437d-baed-a6d60d4f5485-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.105234 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.105207 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.116642 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.116567 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgvd\" (UniqueName: \"kubernetes.io/projected/de8a23e4-07a9-437d-baed-a6d60d4f5485-kube-api-access-psgvd\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.609173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.609133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: 
\"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.611899 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.611864 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/de8a23e4-07a9-437d-baed-a6d60d4f5485-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v7997\" (UID: \"de8a23e4-07a9-437d-baed-a6d60d4f5485\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.809897 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.809852 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" Apr 24 21:17:20.965900 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:20.965845 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v7997"] Apr 24 21:17:20.969767 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:20.969735 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8a23e4_07a9_437d_baed_a6d60d4f5485.slice/crio-02387c788c056eef87d7b96f759e9fa628f7297c5e1b5d9104e9bdb1550dfb12 WatchSource:0}: Error finding container 02387c788c056eef87d7b96f759e9fa628f7297c5e1b5d9104e9bdb1550dfb12: Status 404 returned error can't find the container with id 02387c788c056eef87d7b96f759e9fa628f7297c5e1b5d9104e9bdb1550dfb12 Apr 24 21:17:21.421248 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:21.421213 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w76jt" Apr 24 21:17:21.453451 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:21.453383 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" 
event={"ID":"de8a23e4-07a9-437d-baed-a6d60d4f5485","Type":"ContainerStarted","Data":"02387c788c056eef87d7b96f759e9fa628f7297c5e1b5d9104e9bdb1550dfb12"} Apr 24 21:17:23.464247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:23.464197 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f99bf67-2gt8x" event={"ID":"9ffae187-6cbb-4cdc-a65c-42f879939f6d","Type":"ContainerStarted","Data":"786491737acba8faefc8bfafe45ad106c497f233564d0c932ca4542d3625dfdd"} Apr 24 21:17:23.483267 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:23.483195 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d4f99bf67-2gt8x" podStartSLOduration=1.4793344849999999 podStartE2EDuration="5.483177658s" podCreationTimestamp="2026-04-24 21:17:18 +0000 UTC" firstStartedPulling="2026-04-24 21:17:19.012729718 +0000 UTC m=+79.691566374" lastFinishedPulling="2026-04-24 21:17:23.016572883 +0000 UTC m=+83.695409547" observedRunningTime="2026-04-24 21:17:23.482492007 +0000 UTC m=+84.161328677" watchObservedRunningTime="2026-04-24 21:17:23.483177658 +0000 UTC m=+84.162014331" Apr 24 21:17:24.471630 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:24.471599 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" event={"ID":"de8a23e4-07a9-437d-baed-a6d60d4f5485","Type":"ContainerStarted","Data":"a90c5e0c7272a7c64d2703e4381be31888e80a3fe5e3f3349cb3314874f7ffa6"} Apr 24 21:17:25.476567 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:25.476526 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" event={"ID":"de8a23e4-07a9-437d-baed-a6d60d4f5485","Type":"ContainerStarted","Data":"b38be2d38d95fe70f568cfc8153d8b091e3e42977042cf8303b12d490d42abf9"} Apr 24 21:17:25.498779 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:25.498727 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-v7997" podStartSLOduration=3.135377255 podStartE2EDuration="6.498712063s" podCreationTimestamp="2026-04-24 21:17:19 +0000 UTC" firstStartedPulling="2026-04-24 21:17:20.97245147 +0000 UTC m=+81.651288123" lastFinishedPulling="2026-04-24 21:17:24.335786267 +0000 UTC m=+85.014622931" observedRunningTime="2026-04-24 21:17:25.497442789 +0000 UTC m=+86.176279457" watchObservedRunningTime="2026-04-24 21:17:25.498712063 +0000 UTC m=+86.177548736" Apr 24 21:17:25.559882 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:25.559840 2581 patch_prober.go:28] interesting pod/image-registry-6c69778f6c-6k4sq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:17:25.560038 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:25.559903 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" podUID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:17:26.925094 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:26.925061 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:27.383555 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.383478 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:17:27.496133 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.495733 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-grgjm"] Apr 24 21:17:27.501200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.501179 2581 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.504044 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.504023 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2svx2\"" Apr 24 21:17:27.504446 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.504305 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:17:27.504699 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.504677 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:17:27.505398 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.505216 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:17:27.588842 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.588808 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-accelerators-collector-config\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.588868 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-root\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.588915 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhx7\" (UniqueName: \"kubernetes.io/projected/44be5a66-af42-4898-93c0-df61266a91dd-kube-api-access-nmhx7\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.588994 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-sys\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589169 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.589049 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44be5a66-af42-4898-93c0-df61266a91dd-metrics-client-ca\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589169 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.589092 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-textfile\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589169 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.589131 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " 
pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589169 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.589165 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-tls\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.589365 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.589196 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-wtmp\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690271 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690231 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhx7\" (UniqueName: \"kubernetes.io/projected/44be5a66-af42-4898-93c0-df61266a91dd-kube-api-access-nmhx7\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690307 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-sys\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690337 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44be5a66-af42-4898-93c0-df61266a91dd-metrics-client-ca\") pod \"node-exporter-grgjm\" 
(UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690373 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-textfile\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690409 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690453 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-tls\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690738 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-wtmp\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690738 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690526 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-accelerators-collector-config\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690738 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690569 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-root\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690738 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.690673 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-root\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.690919 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:17:27.690767 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:17:27.690919 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:17:27.690822 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-tls podName:44be5a66-af42-4898-93c0-df61266a91dd nodeName:}" failed. No retries permitted until 2026-04-24 21:17:28.190802874 +0000 UTC m=+88.869639529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-tls") pod "node-exporter-grgjm" (UID: "44be5a66-af42-4898-93c0-df61266a91dd") : secret "node-exporter-tls" not found Apr 24 21:17:27.691485 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.691089 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-wtmp\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.691485 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.691133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44be5a66-af42-4898-93c0-df61266a91dd-sys\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.691485 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.691414 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-textfile\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.691697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.691647 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44be5a66-af42-4898-93c0-df61266a91dd-metrics-client-ca\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.691697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.691672 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-accelerators-collector-config\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.693343 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.693307 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:27.708815 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:27.708779 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhx7\" (UniqueName: \"kubernetes.io/projected/44be5a66-af42-4898-93c0-df61266a91dd-kube-api-access-nmhx7\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:28.195525 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.195490 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-tls\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:28.198711 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.198686 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be5a66-af42-4898-93c0-df61266a91dd-node-exporter-tls\") pod \"node-exporter-grgjm\" (UID: \"44be5a66-af42-4898-93c0-df61266a91dd\") " pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:28.409041 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:17:28.409001 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:28.409222 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.409057 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:28.413709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.413671 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:28.416150 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.416131 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-grgjm" Apr 24 21:17:28.425541 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:17:28.425518 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44be5a66_af42_4898_93c0_df61266a91dd.slice/crio-506b38b02199b62eceed2137eb28d3ec0c04f5460ba368f66aab56a637843cf4 WatchSource:0}: Error finding container 506b38b02199b62eceed2137eb28d3ec0c04f5460ba368f66aab56a637843cf4: Status 404 returned error can't find the container with id 506b38b02199b62eceed2137eb28d3ec0c04f5460ba368f66aab56a637843cf4 Apr 24 21:17:28.488959 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.488872 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-grgjm" event={"ID":"44be5a66-af42-4898-93c0-df61266a91dd","Type":"ContainerStarted","Data":"506b38b02199b62eceed2137eb28d3ec0c04f5460ba368f66aab56a637843cf4"} Apr 24 21:17:28.494570 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:28.494518 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d4f99bf67-2gt8x" Apr 24 21:17:36.522404 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:36.522366 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-mlhtc" event={"ID":"6eacc4cf-b65c-4484-bacb-a5c0e01cefac","Type":"ContainerStarted","Data":"7e718d71bfe64b2b304b52cb6bfa5d0298548e9215fa752758bb591712724722"} Apr 24 21:17:36.522843 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:36.522691 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:36.524012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:36.523979 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-grgjm" event={"ID":"44be5a66-af42-4898-93c0-df61266a91dd","Type":"ContainerStarted","Data":"da82194b8ec0317b0dc06db285a69551eb53151ef712412d6ae138c4c1244d5d"} Apr 24 21:17:36.524271 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:36.524201 2581 patch_prober.go:28] interesting pod/downloads-6bcc868b7-mlhtc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.26:8080/\": dial tcp 10.132.0.26:8080: connect: connection refused" start-of-body= Apr 24 21:17:36.524271 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:36.524250 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-mlhtc" podUID="6eacc4cf-b65c-4484-bacb-a5c0e01cefac" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.26:8080/\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 21:17:36.557718 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:36.557659 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-mlhtc" podStartSLOduration=1.490033916 podStartE2EDuration="20.557644824s" podCreationTimestamp="2026-04-24 21:17:16 +0000 UTC" firstStartedPulling="2026-04-24 21:17:17.377914927 +0000 UTC m=+78.056751590" lastFinishedPulling="2026-04-24 21:17:36.445525845 +0000 UTC m=+97.124362498" observedRunningTime="2026-04-24 
21:17:36.555979929 +0000 UTC m=+97.234816603" watchObservedRunningTime="2026-04-24 21:17:36.557644824 +0000 UTC m=+97.236481553" Apr 24 21:17:37.530136 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:37.530090 2581 generic.go:358] "Generic (PLEG): container finished" podID="44be5a66-af42-4898-93c0-df61266a91dd" containerID="da82194b8ec0317b0dc06db285a69551eb53151ef712412d6ae138c4c1244d5d" exitCode=0 Apr 24 21:17:37.530639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:37.530214 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-grgjm" event={"ID":"44be5a66-af42-4898-93c0-df61266a91dd","Type":"ContainerDied","Data":"da82194b8ec0317b0dc06db285a69551eb53151ef712412d6ae138c4c1244d5d"} Apr 24 21:17:37.541292 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:37.541265 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-mlhtc" Apr 24 21:17:38.536160 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:38.536119 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-grgjm" event={"ID":"44be5a66-af42-4898-93c0-df61266a91dd","Type":"ContainerStarted","Data":"d1da19909911d6bcdb91540692589ef267c6f487ca4a96bb841baee282dedd06"} Apr 24 21:17:38.536710 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:38.536681 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-grgjm" event={"ID":"44be5a66-af42-4898-93c0-df61266a91dd","Type":"ContainerStarted","Data":"d68b75574499cb79b821624426820e0fac4081de30aafea0ccad5009c9d8f4b0"} Apr 24 21:17:38.578967 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:38.578902 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-grgjm" podStartSLOduration=3.608528967 podStartE2EDuration="11.578880499s" podCreationTimestamp="2026-04-24 21:17:27 +0000 UTC" firstStartedPulling="2026-04-24 21:17:28.42798176 +0000 UTC 
m=+89.106818423" lastFinishedPulling="2026-04-24 21:17:36.398333292 +0000 UTC m=+97.077169955" observedRunningTime="2026-04-24 21:17:38.57714912 +0000 UTC m=+99.255985803" watchObservedRunningTime="2026-04-24 21:17:38.578880499 +0000 UTC m=+99.257717171" Apr 24 21:17:41.939545 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:41.939463 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" podUID="d77d521f-bac2-47f1-80bd-1a4c7f08c799" containerName="registry" containerID="cri-o://7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9" gracePeriod=30 Apr 24 21:17:42.222755 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.222728 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:42.228106 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228080 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-certificates\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228122 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-bound-sa-token\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228189 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q7sm\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-kube-api-access-6q7sm\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: 
\"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228337 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228228 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-installation-pull-secrets\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228337 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228258 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228337 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228281 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-image-registry-private-configuration\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228337 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228313 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-trusted-ca\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228518 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228346 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d77d521f-bac2-47f1-80bd-1a4c7f08c799-ca-trust-extracted\") pod \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\" (UID: \"d77d521f-bac2-47f1-80bd-1a4c7f08c799\") " Apr 24 21:17:42.228518 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:17:42.228489 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:17:42.229083 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.228657 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-certificates\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.229083 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.229044 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:17:42.231412 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.231266 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:17:42.231412 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.231391 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:17:42.231709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.231621 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-kube-api-access-6q7sm" (OuterVolumeSpecName: "kube-api-access-6q7sm") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "kube-api-access-6q7sm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:17:42.231709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.231687 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:17:42.231992 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.231954 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:17:42.242600 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.242573 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d77d521f-bac2-47f1-80bd-1a4c7f08c799-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d77d521f-bac2-47f1-80bd-1a4c7f08c799" (UID: "d77d521f-bac2-47f1-80bd-1a4c7f08c799"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:17:42.330002 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.329964 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-installation-pull-secrets\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.330173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.330009 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-registry-tls\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.330173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.330028 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d77d521f-bac2-47f1-80bd-1a4c7f08c799-image-registry-private-configuration\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.330173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.330042 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d77d521f-bac2-47f1-80bd-1a4c7f08c799-trusted-ca\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.330173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.330057 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/d77d521f-bac2-47f1-80bd-1a4c7f08c799-ca-trust-extracted\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.330173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.330069 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-bound-sa-token\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.330173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.330078 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6q7sm\" (UniqueName: \"kubernetes.io/projected/d77d521f-bac2-47f1-80bd-1a4c7f08c799-kube-api-access-6q7sm\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:17:42.552982 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.552689 2581 generic.go:358] "Generic (PLEG): container finished" podID="d77d521f-bac2-47f1-80bd-1a4c7f08c799" containerID="7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9" exitCode=0 Apr 24 21:17:42.552982 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.552766 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" event={"ID":"d77d521f-bac2-47f1-80bd-1a4c7f08c799","Type":"ContainerDied","Data":"7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9"} Apr 24 21:17:42.552982 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.552784 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" Apr 24 21:17:42.552982 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.552803 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68586bbdd8-8kw45" event={"ID":"d77d521f-bac2-47f1-80bd-1a4c7f08c799","Type":"ContainerDied","Data":"f4e8e8e1642fb6c2e459b31fd750452fa6b8f2627f7ce4f4c63005f60ab2d895"} Apr 24 21:17:42.552982 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.552823 2581 scope.go:117] "RemoveContainer" containerID="7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9" Apr 24 21:17:42.567415 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.567390 2581 scope.go:117] "RemoveContainer" containerID="7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9" Apr 24 21:17:42.567773 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:17:42.567743 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9\": container with ID starting with 7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9 not found: ID does not exist" containerID="7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9" Apr 24 21:17:42.567875 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.567781 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9"} err="failed to get container status \"7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9\": rpc error: code = NotFound desc = could not find container \"7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9\": container with ID starting with 7aacf6cc3bab5ea4815f4df95840524fbc596fd149e8e3807316eeeaf82a53a9 not found: ID does not exist" Apr 24 21:17:42.612999 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:17:42.612968 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68586bbdd8-8kw45"] Apr 24 21:17:42.641071 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:42.641038 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68586bbdd8-8kw45"] Apr 24 21:17:43.875111 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:43.875073 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77d521f-bac2-47f1-80bd-1a4c7f08c799" path="/var/lib/kubelet/pods/d77d521f-bac2-47f1-80bd-1a4c7f08c799/volumes" Apr 24 21:17:44.264877 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:44.264838 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c69778f6c-6k4sq"] Apr 24 21:17:51.415173 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:17:51.415138 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d4f99bf67-2gt8x"] Apr 24 21:18:07.639493 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:07.639456 2581 generic.go:358] "Generic (PLEG): container finished" podID="cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45" containerID="6760dd643817561786fe97e103fe9eca3c770fec38fa7034557c267750556587" exitCode=0 Apr 24 21:18:07.639846 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:07.639532 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" event={"ID":"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45","Type":"ContainerDied","Data":"6760dd643817561786fe97e103fe9eca3c770fec38fa7034557c267750556587"} Apr 24 21:18:07.639846 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:07.639834 2581 scope.go:117] "RemoveContainer" containerID="6760dd643817561786fe97e103fe9eca3c770fec38fa7034557c267750556587" Apr 24 21:18:08.644641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:08.644609 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z962g" event={"ID":"cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45","Type":"ContainerStarted","Data":"40212d62015168839ffe941cd38ad0e8e6d910e382620701ff2503ce4ad6797b"} Apr 24 21:18:09.287708 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.287648 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" podUID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" containerName="registry" containerID="cri-o://f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958" gracePeriod=30 Apr 24 21:18:09.566000 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.565977 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:18:09.649345 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.649306 2581 generic.go:358] "Generic (PLEG): container finished" podID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" containerID="f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958" exitCode=0 Apr 24 21:18:09.649747 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.649352 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" event={"ID":"ff062d15-1ff3-4d8b-92be-3341e5f59abb","Type":"ContainerDied","Data":"f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958"} Apr 24 21:18:09.649747 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.649366 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" Apr 24 21:18:09.649747 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.649378 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c69778f6c-6k4sq" event={"ID":"ff062d15-1ff3-4d8b-92be-3341e5f59abb","Type":"ContainerDied","Data":"4292814f213fe124b26a1670d0ffb582b63d9eb1e97c03de7ebc2d11defddc38"} Apr 24 21:18:09.649747 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.649395 2581 scope.go:117] "RemoveContainer" containerID="f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958" Apr 24 21:18:09.652480 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652462 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff062d15-1ff3-4d8b-92be-3341e5f59abb-ca-trust-extracted\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652580 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652535 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652580 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652571 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-installation-pull-secrets\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652679 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652600 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-bound-sa-token\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652679 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652634 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-trusted-ca\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652679 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652659 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-image-registry-private-configuration\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652704 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fxtg\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-kube-api-access-2fxtg\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.652827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.652794 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-certificates\") pod \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\" (UID: \"ff062d15-1ff3-4d8b-92be-3341e5f59abb\") " Apr 24 21:18:09.653610 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.653398 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-certificates" (OuterVolumeSpecName: 
"registry-certificates") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:09.653731 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.653681 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:09.655378 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.655345 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:09.655378 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.655351 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-kube-api-access-2fxtg" (OuterVolumeSpecName: "kube-api-access-2fxtg") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "kube-api-access-2fxtg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:09.655583 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.655505 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:09.655828 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.655799 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:09.656021 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.656000 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:09.659075 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.659050 2581 scope.go:117] "RemoveContainer" containerID="f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958" Apr 24 21:18:09.659353 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:18:09.659335 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958\": container with ID starting with f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958 not found: ID does not exist" containerID="f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958" Apr 24 21:18:09.659407 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.659363 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958"} err="failed to get container status \"f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958\": rpc error: code = NotFound desc = could not find container \"f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958\": container with ID starting with f6b26feb5555fdf68823f814316d1522177c787f78a7d8b0d6a3a262674ac958 not found: ID does not exist" Apr 24 21:18:09.661902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.661879 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff062d15-1ff3-4d8b-92be-3341e5f59abb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ff062d15-1ff3-4d8b-92be-3341e5f59abb" (UID: "ff062d15-1ff3-4d8b-92be-3341e5f59abb"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:18:09.753953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753927 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff062d15-1ff3-4d8b-92be-3341e5f59abb-ca-trust-extracted\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:18:09.753953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753952 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-tls\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:18:09.754098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753962 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-installation-pull-secrets\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:18:09.754098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753971 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-bound-sa-token\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:18:09.754098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753980 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-trusted-ca\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:18:09.754098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753988 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff062d15-1ff3-4d8b-92be-3341e5f59abb-image-registry-private-configuration\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:18:09.754098 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.753997 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2fxtg\" (UniqueName: \"kubernetes.io/projected/ff062d15-1ff3-4d8b-92be-3341e5f59abb-kube-api-access-2fxtg\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:09.754098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.754006 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff062d15-1ff3-4d8b-92be-3341e5f59abb-registry-certificates\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:09.964114 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.964081 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c69778f6c-6k4sq"]
Apr 24 21:18:09.967862 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:09.967833 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6c69778f6c-6k4sq"]
Apr 24 21:18:11.874309 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:11.874276 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" path="/var/lib/kubelet/pods/ff062d15-1ff3-4d8b-92be-3341e5f59abb/volumes"
Apr 24 21:18:13.545920 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:13.545876 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" podUID="d799708d-6592-4222-b0e7-a25a20dc584e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:18:13.663399 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:13.663364 2581 generic.go:358] "Generic (PLEG): container finished" podID="6039cd07-a35a-4794-af31-da75ea5a3fa6" containerID="33dba131003e42c603eea4a2f027428329e5efd358ba4b4fa1e96405aab65ba2" exitCode=0
Apr 24 21:18:13.663608 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:13.663418 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" event={"ID":"6039cd07-a35a-4794-af31-da75ea5a3fa6","Type":"ContainerDied","Data":"33dba131003e42c603eea4a2f027428329e5efd358ba4b4fa1e96405aab65ba2"}
Apr 24 21:18:13.663786 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:13.663771 2581 scope.go:117] "RemoveContainer" containerID="33dba131003e42c603eea4a2f027428329e5efd358ba4b4fa1e96405aab65ba2"
Apr 24 21:18:14.668233 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:14.668198 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-k25j5" event={"ID":"6039cd07-a35a-4794-af31-da75ea5a3fa6","Type":"ContainerStarted","Data":"2a6a11b77b3ba4c39a4e81edd9f91594b10a3f07f42e2e918ac974b6ccf71f8e"}
Apr 24 21:18:16.434574 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.434533 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d4f99bf67-2gt8x" podUID="9ffae187-6cbb-4cdc-a65c-42f879939f6d" containerName="console" containerID="cri-o://786491737acba8faefc8bfafe45ad106c497f233564d0c932ca4542d3625dfdd" gracePeriod=15
Apr 24 21:18:16.676044 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.676020 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4f99bf67-2gt8x_9ffae187-6cbb-4cdc-a65c-42f879939f6d/console/0.log"
Apr 24 21:18:16.676191 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.676061 2581 generic.go:358] "Generic (PLEG): container finished" podID="9ffae187-6cbb-4cdc-a65c-42f879939f6d" containerID="786491737acba8faefc8bfafe45ad106c497f233564d0c932ca4542d3625dfdd" exitCode=2
Apr 24 21:18:16.676191 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.676137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f99bf67-2gt8x" event={"ID":"9ffae187-6cbb-4cdc-a65c-42f879939f6d","Type":"ContainerDied","Data":"786491737acba8faefc8bfafe45ad106c497f233564d0c932ca4542d3625dfdd"}
Apr 24 21:18:16.719684 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.719664 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4f99bf67-2gt8x_9ffae187-6cbb-4cdc-a65c-42f879939f6d/console/0.log"
Apr 24 21:18:16.719786 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.719726 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d4f99bf67-2gt8x"
Apr 24 21:18:16.819410 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819380 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-serving-cert\") pod \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") "
Apr 24 21:18:16.819605 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819493 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrwmd\" (UniqueName: \"kubernetes.io/projected/9ffae187-6cbb-4cdc-a65c-42f879939f6d-kube-api-access-qrwmd\") pod \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") "
Apr 24 21:18:16.819605 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819527 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-service-ca\") pod \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") "
Apr 24 21:18:16.819605 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819543 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-config\") pod \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") "
Apr 24 21:18:16.819605 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819563 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-oauth-config\") pod \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") "
Apr 24 21:18:16.819605 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819586 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-oauth-serving-cert\") pod \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\" (UID: \"9ffae187-6cbb-4cdc-a65c-42f879939f6d\") "
Apr 24 21:18:16.819996 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.819964 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-service-ca" (OuterVolumeSpecName: "service-ca") pod "9ffae187-6cbb-4cdc-a65c-42f879939f6d" (UID: "9ffae187-6cbb-4cdc-a65c-42f879939f6d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:16.820115 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.820015 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-config" (OuterVolumeSpecName: "console-config") pod "9ffae187-6cbb-4cdc-a65c-42f879939f6d" (UID: "9ffae187-6cbb-4cdc-a65c-42f879939f6d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:16.820115 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.820044 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9ffae187-6cbb-4cdc-a65c-42f879939f6d" (UID: "9ffae187-6cbb-4cdc-a65c-42f879939f6d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:16.821758 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.821735 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffae187-6cbb-4cdc-a65c-42f879939f6d-kube-api-access-qrwmd" (OuterVolumeSpecName: "kube-api-access-qrwmd") pod "9ffae187-6cbb-4cdc-a65c-42f879939f6d" (UID: "9ffae187-6cbb-4cdc-a65c-42f879939f6d"). InnerVolumeSpecName "kube-api-access-qrwmd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:18:16.822020 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.822003 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9ffae187-6cbb-4cdc-a65c-42f879939f6d" (UID: "9ffae187-6cbb-4cdc-a65c-42f879939f6d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:16.822071 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.822032 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9ffae187-6cbb-4cdc-a65c-42f879939f6d" (UID: "9ffae187-6cbb-4cdc-a65c-42f879939f6d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:16.920471 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.920411 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-service-ca\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:16.920471 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.920471 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-config\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:16.920650 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.920487 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-oauth-config\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:16.920650 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.920503 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ffae187-6cbb-4cdc-a65c-42f879939f6d-oauth-serving-cert\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:16.920650 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.920516 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ffae187-6cbb-4cdc-a65c-42f879939f6d-console-serving-cert\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:16.920650 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:16.920530 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrwmd\" (UniqueName: \"kubernetes.io/projected/9ffae187-6cbb-4cdc-a65c-42f879939f6d-kube-api-access-qrwmd\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:18:17.685737 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.685708 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4f99bf67-2gt8x_9ffae187-6cbb-4cdc-a65c-42f879939f6d/console/0.log"
Apr 24 21:18:17.686192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.685757 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4f99bf67-2gt8x" event={"ID":"9ffae187-6cbb-4cdc-a65c-42f879939f6d","Type":"ContainerDied","Data":"73c44fffe8758501123416d61472579d80592def64e305280657cde2b7ee69fa"}
Apr 24 21:18:17.686192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.685791 2581 scope.go:117] "RemoveContainer" containerID="786491737acba8faefc8bfafe45ad106c497f233564d0c932ca4542d3625dfdd"
Apr 24 21:18:17.686192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.685843 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d4f99bf67-2gt8x"
Apr 24 21:18:17.707083 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.707061 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d4f99bf67-2gt8x"]
Apr 24 21:18:17.709904 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.709881 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d4f99bf67-2gt8x"]
Apr 24 21:18:17.873889 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:17.873861 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffae187-6cbb-4cdc-a65c-42f879939f6d" path="/var/lib/kubelet/pods/9ffae187-6cbb-4cdc-a65c-42f879939f6d/volumes"
Apr 24 21:18:18.689794 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:18.689755 2581 generic.go:358] "Generic (PLEG): container finished" podID="d1b7bcd1-e58f-42c3-9a78-a06df4ff2253" containerID="389c82caffb0d8370bbc1f5b36a010a444dba0dbe1213457688f8ed14be5146b" exitCode=0
Apr 24 21:18:18.690214 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:18.689831 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xfzqb" event={"ID":"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253","Type":"ContainerDied","Data":"389c82caffb0d8370bbc1f5b36a010a444dba0dbe1213457688f8ed14be5146b"}
Apr 24 21:18:18.690214 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:18.690201 2581 scope.go:117] "RemoveContainer" containerID="389c82caffb0d8370bbc1f5b36a010a444dba0dbe1213457688f8ed14be5146b"
Apr 24 21:18:19.695691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:19.695656 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xfzqb" event={"ID":"d1b7bcd1-e58f-42c3-9a78-a06df4ff2253","Type":"ContainerStarted","Data":"1bacaa8c22c9b79a5d4a0cb9e5fa9f7599b02236b8c2633602b649cc71905303"}
Apr 24 21:18:23.546595 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:23.546553 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" podUID="d799708d-6592-4222-b0e7-a25a20dc584e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:18:33.546193 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.546155 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" podUID="d799708d-6592-4222-b0e7-a25a20dc584e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:18:33.546731 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.546241 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58"
Apr 24 21:18:33.546792 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.546729 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"4e669f475ba2ca6610fc0d74cf3edd5550ad9e46f9f0e0799799f0f5539e3e46"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 21:18:33.546792 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.546767 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" podUID="d799708d-6592-4222-b0e7-a25a20dc584e" containerName="service-proxy" containerID="cri-o://4e669f475ba2ca6610fc0d74cf3edd5550ad9e46f9f0e0799799f0f5539e3e46" gracePeriod=30
Apr 24 21:18:33.742640 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.742610 2581 generic.go:358] "Generic (PLEG): container finished" podID="d799708d-6592-4222-b0e7-a25a20dc584e" containerID="4e669f475ba2ca6610fc0d74cf3edd5550ad9e46f9f0e0799799f0f5539e3e46" exitCode=2
Apr 24 21:18:33.742777 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.742684 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" event={"ID":"d799708d-6592-4222-b0e7-a25a20dc584e","Type":"ContainerDied","Data":"4e669f475ba2ca6610fc0d74cf3edd5550ad9e46f9f0e0799799f0f5539e3e46"}
Apr 24 21:18:33.742777 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:18:33.742725 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fb9cf4bb8-mwt58" event={"ID":"d799708d-6592-4222-b0e7-a25a20dc584e","Type":"ContainerStarted","Data":"7e73864caaf178c17aeaf20f47da157a42dbfcd655648221e79717db249eab3d"}
Apr 24 21:20:59.787438 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:20:59.787387 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log"
Apr 24 21:20:59.793311 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:20:59.793275 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 21:20:59.793479 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:20:59.793398 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log"
Apr 24 21:20:59.796878 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:20:59.796862 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:20:59.798866 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:20:59.798848 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 21:22:37.631591 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631560 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"]
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631906 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d77d521f-bac2-47f1-80bd-1a4c7f08c799" containerName="registry"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631919 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77d521f-bac2-47f1-80bd-1a4c7f08c799" containerName="registry"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631935 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" containerName="registry"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631941 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" containerName="registry"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631950 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ffae187-6cbb-4cdc-a65c-42f879939f6d" containerName="console"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.631955 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffae187-6cbb-4cdc-a65c-42f879939f6d" containerName="console"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.632003 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d77d521f-bac2-47f1-80bd-1a4c7f08c799" containerName="registry"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.632013 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ffae187-6cbb-4cdc-a65c-42f879939f6d" containerName="console"
Apr 24 21:22:37.632027 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.632020 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff062d15-1ff3-4d8b-92be-3341e5f59abb" containerName="registry"
Apr 24 21:22:37.634998 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.634981 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.637548 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.637522 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 24 21:22:37.637667 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.637559 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-6gwkk\""
Apr 24 21:22:37.637667 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.637570 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 24 21:22:37.647016 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.646995 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"]
Apr 24 21:22:37.724634 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.724607 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlsz\" (UniqueName: \"kubernetes.io/projected/cc3a4977-a281-476c-9c01-d4f1e4963cf3-kube-api-access-fjlsz\") pod \"servicemesh-operator3-55f49c5f94-fwk7d\" (UID: \"cc3a4977-a281-476c-9c01-d4f1e4963cf3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.724785 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.724651 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/cc3a4977-a281-476c-9c01-d4f1e4963cf3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-fwk7d\" (UID: \"cc3a4977-a281-476c-9c01-d4f1e4963cf3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.825606 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.825585 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/cc3a4977-a281-476c-9c01-d4f1e4963cf3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-fwk7d\" (UID: \"cc3a4977-a281-476c-9c01-d4f1e4963cf3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.825739 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.825640 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlsz\" (UniqueName: \"kubernetes.io/projected/cc3a4977-a281-476c-9c01-d4f1e4963cf3-kube-api-access-fjlsz\") pod \"servicemesh-operator3-55f49c5f94-fwk7d\" (UID: \"cc3a4977-a281-476c-9c01-d4f1e4963cf3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.827990 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.827973 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/cc3a4977-a281-476c-9c01-d4f1e4963cf3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-fwk7d\" (UID: \"cc3a4977-a281-476c-9c01-d4f1e4963cf3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.834698 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.834677 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlsz\" (UniqueName: \"kubernetes.io/projected/cc3a4977-a281-476c-9c01-d4f1e4963cf3-kube-api-access-fjlsz\") pod \"servicemesh-operator3-55f49c5f94-fwk7d\" (UID: \"cc3a4977-a281-476c-9c01-d4f1e4963cf3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:37.944153 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:37.944130 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:38.298535 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:38.298455 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"]
Apr 24 21:22:38.301846 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:22:38.301818 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc3a4977_a281_476c_9c01_d4f1e4963cf3.slice/crio-44deac746aadb78f26ead71601443542552899ce00219716db6b02a08805aceb WatchSource:0}: Error finding container 44deac746aadb78f26ead71601443542552899ce00219716db6b02a08805aceb: Status 404 returned error can't find the container with id 44deac746aadb78f26ead71601443542552899ce00219716db6b02a08805aceb
Apr 24 21:22:38.304289 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:38.304271 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:22:38.442911 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:38.442879 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d" event={"ID":"cc3a4977-a281-476c-9c01-d4f1e4963cf3","Type":"ContainerStarted","Data":"44deac746aadb78f26ead71601443542552899ce00219716db6b02a08805aceb"}
Apr 24 21:22:41.455651 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:41.455621 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d" event={"ID":"cc3a4977-a281-476c-9c01-d4f1e4963cf3","Type":"ContainerStarted","Data":"7837c9ebcdd9878abc9ad8d8499476073100728f58b9f6f47d03fd3f048c1c65"}
Apr 24 21:22:41.456044 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:41.455764 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d"
Apr 24 21:22:41.477163 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:41.477107 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d" podStartSLOduration=1.809584866 podStartE2EDuration="4.477092766s" podCreationTimestamp="2026-04-24 21:22:37 +0000 UTC" firstStartedPulling="2026-04-24 21:22:38.304438731 +0000 UTC m=+398.983275381" lastFinishedPulling="2026-04-24 21:22:40.97194663 +0000 UTC m=+401.650783281" observedRunningTime="2026-04-24 21:22:41.475146553 +0000 UTC m=+402.153983234" watchObservedRunningTime="2026-04-24 21:22:41.477092766 +0000 UTC m=+402.155929439"
Apr 24 21:22:44.134650 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.134616 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"]
Apr 24 21:22:44.137898 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.137882 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.140492 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.140470 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 24 21:22:44.140666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.140497 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 24 21:22:44.140666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.140516 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-8d7mz\""
Apr 24 21:22:44.140666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.140471 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 24 21:22:44.141076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.141058 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 24 21:22:44.151454 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.151411 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"]
Apr 24 21:22:44.278238 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278206 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8d9w\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-kube-api-access-m8d9w\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.278238 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278238 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.278478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278259 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.278478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278287 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.278478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278302 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.278478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278328 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.278478 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.278349 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379218 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379184 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379218 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379217 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379239 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379270 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379361 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8d9w\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-kube-api-access-m8d9w\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379389 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.379482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.379415 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.380096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.380072 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.381697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.381675 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.381794 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.381717 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.381952 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.381933 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.381989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.381969 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.387197 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.387152 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.387483 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.387466 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8d9w\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-kube-api-access-m8d9w\") pod \"istiod-openshift-gateway-7cd77c7ffd-7z694\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.449587 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.449555 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
Apr 24 21:22:44.583184 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:44.583158 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"]
Apr 24 21:22:44.585304 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:22:44.585270 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd0fa2b_28ef_4aff_9929_e03212b3f28b.slice/crio-19db6937aad8465d0b99fc96cbb9e60f22e6f046ec38ef21fc80db4317d2fb4e WatchSource:0}: Error finding container 19db6937aad8465d0b99fc96cbb9e60f22e6f046ec38ef21fc80db4317d2fb4e: Status 404 returned error can't find the container with id 19db6937aad8465d0b99fc96cbb9e60f22e6f046ec38ef21fc80db4317d2fb4e
Apr 24 21:22:45.477704 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:45.474933 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" event={"ID":"1dd0fa2b-28ef-4aff-9929-e03212b3f28b","Type":"ContainerStarted","Data":"19db6937aad8465d0b99fc96cbb9e60f22e6f046ec38ef21fc80db4317d2fb4e"}
Apr 24 21:22:47.004011 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:47.003803 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 24 21:22:47.004011 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:47.003941 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 24 21:22:47.483668 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:47.483634 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"
event={"ID":"1dd0fa2b-28ef-4aff-9929-e03212b3f28b","Type":"ContainerStarted","Data":"f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875"} Apr 24 21:22:47.483834 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:47.483757 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" Apr 24 21:22:47.507836 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:47.507787 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" podStartSLOduration=1.091492407 podStartE2EDuration="3.507773406s" podCreationTimestamp="2026-04-24 21:22:44 +0000 UTC" firstStartedPulling="2026-04-24 21:22:44.587303386 +0000 UTC m=+405.266140036" lastFinishedPulling="2026-04-24 21:22:47.003584381 +0000 UTC m=+407.682421035" observedRunningTime="2026-04-24 21:22:47.506112622 +0000 UTC m=+408.184949317" watchObservedRunningTime="2026-04-24 21:22:47.507773406 +0000 UTC m=+408.186610077" Apr 24 21:22:48.489601 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:48.489572 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" Apr 24 21:22:52.464230 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:22:52.464199 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-fwk7d" Apr 24 21:23:11.728112 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.728074 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j"] Apr 24 21:23:11.734289 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.734260 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:11.737334 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.737317 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 21:23:11.737976 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.737959 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 21:23:11.738054 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.737961 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-2t9px\"" Apr 24 21:23:11.743251 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.743224 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j"] Apr 24 21:23:11.797754 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.797726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7j2\" (UniqueName: \"kubernetes.io/projected/8c7e8e7c-0861-4331-996f-ff602b50ef8b-kube-api-access-fb7j2\") pod \"limitador-operator-controller-manager-c7fb4c8d5-9lv9j\" (UID: \"8c7e8e7c-0861-4331-996f-ff602b50ef8b\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:11.898618 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:11.898590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7j2\" (UniqueName: \"kubernetes.io/projected/8c7e8e7c-0861-4331-996f-ff602b50ef8b-kube-api-access-fb7j2\") pod \"limitador-operator-controller-manager-c7fb4c8d5-9lv9j\" (UID: \"8c7e8e7c-0861-4331-996f-ff602b50ef8b\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:11.906505 ip-10-0-139-15 kubenswrapper[2581]: 
I0424 21:23:11.906483 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7j2\" (UniqueName: \"kubernetes.io/projected/8c7e8e7c-0861-4331-996f-ff602b50ef8b-kube-api-access-fb7j2\") pod \"limitador-operator-controller-manager-c7fb4c8d5-9lv9j\" (UID: \"8c7e8e7c-0861-4331-996f-ff602b50ef8b\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:12.045656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:12.045579 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:12.382103 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:12.382078 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j"] Apr 24 21:23:12.384475 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:23:12.384447 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c7e8e7c_0861_4331_996f_ff602b50ef8b.slice/crio-e5a92046819354cbd9c8502f8ea28137dea0e7aa9bb686e6b1b4be693a68bcd3 WatchSource:0}: Error finding container e5a92046819354cbd9c8502f8ea28137dea0e7aa9bb686e6b1b4be693a68bcd3: Status 404 returned error can't find the container with id e5a92046819354cbd9c8502f8ea28137dea0e7aa9bb686e6b1b4be693a68bcd3 Apr 24 21:23:12.570122 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:12.570089 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" event={"ID":"8c7e8e7c-0861-4331-996f-ff602b50ef8b","Type":"ContainerStarted","Data":"e5a92046819354cbd9c8502f8ea28137dea0e7aa9bb686e6b1b4be693a68bcd3"} Apr 24 21:23:15.582118 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:15.582037 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" 
event={"ID":"8c7e8e7c-0861-4331-996f-ff602b50ef8b","Type":"ContainerStarted","Data":"595e59ef1b21325eeb99f3b82748b23e0a10443cc49e7c3b7d5bcaf9d5e939ca"} Apr 24 21:23:15.582496 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:15.582189 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:15.602171 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:15.602121 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" podStartSLOduration=1.689286047 podStartE2EDuration="4.60210708s" podCreationTimestamp="2026-04-24 21:23:11 +0000 UTC" firstStartedPulling="2026-04-24 21:23:12.386988588 +0000 UTC m=+433.065825238" lastFinishedPulling="2026-04-24 21:23:15.299809582 +0000 UTC m=+435.978646271" observedRunningTime="2026-04-24 21:23:15.600822228 +0000 UTC m=+436.279658901" watchObservedRunningTime="2026-04-24 21:23:15.60210708 +0000 UTC m=+436.280943751" Apr 24 21:23:17.604766 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.604735 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc"] Apr 24 21:23:17.608158 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.608138 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.611031 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.611011 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wjllt\"" Apr 24 21:23:17.626316 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.626296 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc"] Apr 24 21:23:17.649053 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.649023 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-cshcc\" (UID: \"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.649155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.649065 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgmsw\" (UniqueName: \"kubernetes.io/projected/f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5-kube-api-access-qgmsw\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-cshcc\" (UID: \"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.749624 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.749596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgmsw\" (UniqueName: \"kubernetes.io/projected/f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5-kube-api-access-qgmsw\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-cshcc\" (UID: \"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.749747 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.749672 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-cshcc\" (UID: \"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.750015 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.749998 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-cshcc\" (UID: \"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.757624 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.757604 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgmsw\" (UniqueName: \"kubernetes.io/projected/f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5-kube-api-access-qgmsw\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-cshcc\" (UID: \"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:17.918122 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:17.918051 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:18.049444 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:18.049396 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc"] Apr 24 21:23:18.053371 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:23:18.053335 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6dcc8a9_8a3d_435a_8db9_70fc2b938bf5.slice/crio-7f413a53c08f6f695dfca03f9eda1ed4b957925fd76723deda0735f21b42e8ff WatchSource:0}: Error finding container 7f413a53c08f6f695dfca03f9eda1ed4b957925fd76723deda0735f21b42e8ff: Status 404 returned error can't find the container with id 7f413a53c08f6f695dfca03f9eda1ed4b957925fd76723deda0735f21b42e8ff Apr 24 21:23:18.593005 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:18.592973 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" event={"ID":"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5","Type":"ContainerStarted","Data":"7f413a53c08f6f695dfca03f9eda1ed4b957925fd76723deda0735f21b42e8ff"} Apr 24 21:23:22.607997 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:22.607953 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" event={"ID":"f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5","Type":"ContainerStarted","Data":"9853c3d5f5f02ae9abae8947825669955dde7af759cc64eda3052e77b1c7256d"} Apr 24 21:23:22.608459 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:22.608079 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:23:22.649677 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:22.649615 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" podStartSLOduration=1.876753833 podStartE2EDuration="5.649596704s" podCreationTimestamp="2026-04-24 21:23:17 +0000 UTC" firstStartedPulling="2026-04-24 21:23:18.055897506 +0000 UTC m=+438.734734163" lastFinishedPulling="2026-04-24 21:23:21.828740384 +0000 UTC m=+442.507577034" observedRunningTime="2026-04-24 21:23:22.649100369 +0000 UTC m=+443.327937041" watchObservedRunningTime="2026-04-24 21:23:22.649596704 +0000 UTC m=+443.328433381" Apr 24 21:23:26.587662 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:26.587633 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-9lv9j" Apr 24 21:23:33.612657 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:23:33.612626 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-cshcc" Apr 24 21:24:08.346959 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.346917 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:08.354380 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.354356 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.357208 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.357181 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 21:24:08.357505 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.357487 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pqks9\"" Apr 24 21:24:08.358472 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.358452 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:08.444453 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.444410 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:08.452488 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.452462 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0bc97aed-da51-45fa-9896-ad3472bc94d8-config-file\") pod \"limitador-limitador-64c8f475fb-jwvdj\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.452613 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.452510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mp62\" (UniqueName: \"kubernetes.io/projected/0bc97aed-da51-45fa-9896-ad3472bc94d8-kube-api-access-7mp62\") pod \"limitador-limitador-64c8f475fb-jwvdj\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.553723 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.553692 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/0bc97aed-da51-45fa-9896-ad3472bc94d8-config-file\") pod \"limitador-limitador-64c8f475fb-jwvdj\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.553870 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.553733 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mp62\" (UniqueName: \"kubernetes.io/projected/0bc97aed-da51-45fa-9896-ad3472bc94d8-kube-api-access-7mp62\") pod \"limitador-limitador-64c8f475fb-jwvdj\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.554295 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.554266 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0bc97aed-da51-45fa-9896-ad3472bc94d8-config-file\") pod \"limitador-limitador-64c8f475fb-jwvdj\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.568117 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.568091 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mp62\" (UniqueName: \"kubernetes.io/projected/0bc97aed-da51-45fa-9896-ad3472bc94d8-kube-api-access-7mp62\") pod \"limitador-limitador-64c8f475fb-jwvdj\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.666112 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.666033 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:08.791068 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:08.790955 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:08.793473 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:24:08.793441 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc97aed_da51_45fa_9896_ad3472bc94d8.slice/crio-e0043d1dfa52efcb2567a245dd80e8be8ef1d2ed65e7a8b1bdca7ff6988aa8c7 WatchSource:0}: Error finding container e0043d1dfa52efcb2567a245dd80e8be8ef1d2ed65e7a8b1bdca7ff6988aa8c7: Status 404 returned error can't find the container with id e0043d1dfa52efcb2567a245dd80e8be8ef1d2ed65e7a8b1bdca7ff6988aa8c7 Apr 24 21:24:09.768655 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:09.768621 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" event={"ID":"0bc97aed-da51-45fa-9896-ad3472bc94d8","Type":"ContainerStarted","Data":"e0043d1dfa52efcb2567a245dd80e8be8ef1d2ed65e7a8b1bdca7ff6988aa8c7"} Apr 24 21:24:12.779956 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:12.779918 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" event={"ID":"0bc97aed-da51-45fa-9896-ad3472bc94d8","Type":"ContainerStarted","Data":"3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c"} Apr 24 21:24:12.780285 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:12.780069 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:12.797098 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:12.797055 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" podStartSLOduration=0.919969844 
podStartE2EDuration="4.797044349s" podCreationTimestamp="2026-04-24 21:24:08 +0000 UTC" firstStartedPulling="2026-04-24 21:24:08.795204244 +0000 UTC m=+489.474040894" lastFinishedPulling="2026-04-24 21:24:12.672278739 +0000 UTC m=+493.351115399" observedRunningTime="2026-04-24 21:24:12.79612028 +0000 UTC m=+493.474956946" watchObservedRunningTime="2026-04-24 21:24:12.797044349 +0000 UTC m=+493.475881055" Apr 24 21:24:23.784924 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:23.784887 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:24.662030 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:24.661995 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:24.662255 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:24.662212 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" podUID="0bc97aed-da51-45fa-9896-ad3472bc94d8" containerName="limitador" containerID="cri-o://3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c" gracePeriod=30 Apr 24 21:24:25.208691 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.208669 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:25.291217 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.291151 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mp62\" (UniqueName: \"kubernetes.io/projected/0bc97aed-da51-45fa-9896-ad3472bc94d8-kube-api-access-7mp62\") pod \"0bc97aed-da51-45fa-9896-ad3472bc94d8\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " Apr 24 21:24:25.291359 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.291223 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0bc97aed-da51-45fa-9896-ad3472bc94d8-config-file\") pod \"0bc97aed-da51-45fa-9896-ad3472bc94d8\" (UID: \"0bc97aed-da51-45fa-9896-ad3472bc94d8\") " Apr 24 21:24:25.291559 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.291534 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc97aed-da51-45fa-9896-ad3472bc94d8-config-file" (OuterVolumeSpecName: "config-file") pod "0bc97aed-da51-45fa-9896-ad3472bc94d8" (UID: "0bc97aed-da51-45fa-9896-ad3472bc94d8"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:24:25.293292 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.293270 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc97aed-da51-45fa-9896-ad3472bc94d8-kube-api-access-7mp62" (OuterVolumeSpecName: "kube-api-access-7mp62") pod "0bc97aed-da51-45fa-9896-ad3472bc94d8" (UID: "0bc97aed-da51-45fa-9896-ad3472bc94d8"). InnerVolumeSpecName "kube-api-access-7mp62". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:24:25.392640 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.392611 2581 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/0bc97aed-da51-45fa-9896-ad3472bc94d8-config-file\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:25.392640 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.392637 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mp62\" (UniqueName: \"kubernetes.io/projected/0bc97aed-da51-45fa-9896-ad3472bc94d8-kube-api-access-7mp62\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:25.824976 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.824938 2581 generic.go:358] "Generic (PLEG): container finished" podID="0bc97aed-da51-45fa-9896-ad3472bc94d8" containerID="3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c" exitCode=0 Apr 24 21:24:25.825185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.825001 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" Apr 24 21:24:25.825185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.825029 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" event={"ID":"0bc97aed-da51-45fa-9896-ad3472bc94d8","Type":"ContainerDied","Data":"3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c"} Apr 24 21:24:25.825185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.825077 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jwvdj" event={"ID":"0bc97aed-da51-45fa-9896-ad3472bc94d8","Type":"ContainerDied","Data":"e0043d1dfa52efcb2567a245dd80e8be8ef1d2ed65e7a8b1bdca7ff6988aa8c7"} Apr 24 21:24:25.825185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.825099 2581 scope.go:117] "RemoveContainer" containerID="3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c" Apr 24 21:24:25.833017 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.832998 2581 scope.go:117] "RemoveContainer" containerID="3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c" Apr 24 21:24:25.833273 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:24:25.833253 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c\": container with ID starting with 3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c not found: ID does not exist" containerID="3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c" Apr 24 21:24:25.833324 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.833280 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c"} err="failed to get container status 
\"3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c\": rpc error: code = NotFound desc = could not find container \"3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c\": container with ID starting with 3a46db38ed93a79fefb2e41314d0167fe09635256d24977a286c475b3b4be72c not found: ID does not exist" Apr 24 21:24:25.845880 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.845857 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:25.849668 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.849645 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jwvdj"] Apr 24 21:24:25.873823 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:25.873792 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc97aed-da51-45fa-9896-ad3472bc94d8" path="/var/lib/kubelet/pods/0bc97aed-da51-45fa-9896-ad3472bc94d8/volumes" Apr 24 21:24:43.654181 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.654145 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49"] Apr 24 21:24:43.654637 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.654487 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bc97aed-da51-45fa-9896-ad3472bc94d8" containerName="limitador" Apr 24 21:24:43.654637 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.654499 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc97aed-da51-45fa-9896-ad3472bc94d8" containerName="limitador" Apr 24 21:24:43.654637 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.654565 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bc97aed-da51-45fa-9896-ad3472bc94d8" containerName="limitador" Apr 24 21:24:43.659005 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.658975 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.672797 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.672775 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49"] Apr 24 21:24:43.736172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736144 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.736350 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/293aaa20-5463-4bbc-be4b-fa5c379dae0c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.736350 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736206 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.736350 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736258 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.736350 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736291 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.736536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736357 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr68b\" (UniqueName: \"kubernetes.io/projected/293aaa20-5463-4bbc-be4b-fa5c379dae0c-kube-api-access-fr68b\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.736536 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.736407 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.836999 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.836961 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.837180 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837014 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.837180 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837131 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/293aaa20-5463-4bbc-be4b-fa5c379dae0c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.837180 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837170 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.837343 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837192 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.837343 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837228 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" 
(UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.837343 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837277 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr68b\" (UniqueName: \"kubernetes.io/projected/293aaa20-5463-4bbc-be4b-fa5c379dae0c-kube-api-access-fr68b\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.838012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.837987 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.839984 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.839956 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/293aaa20-5463-4bbc-be4b-fa5c379dae0c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.840094 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.839994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.840094 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.840004 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.840094 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.840029 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/293aaa20-5463-4bbc-be4b-fa5c379dae0c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.857693 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.857663 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr68b\" (UniqueName: \"kubernetes.io/projected/293aaa20-5463-4bbc-be4b-fa5c379dae0c-kube-api-access-fr68b\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.860598 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.860562 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/293aaa20-5463-4bbc-be4b-fa5c379dae0c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rgd49\" (UID: \"293aaa20-5463-4bbc-be4b-fa5c379dae0c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:43.968477 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:43.968448 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:44.144051 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.144026 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49"] Apr 24 21:24:44.146226 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:24:44.146201 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod293aaa20_5463_4bbc_be4b_fa5c379dae0c.slice/crio-8e44826dfd4c4aec3d44ac5f507d015fd1dc569047d4998fbce530db98a009c7 WatchSource:0}: Error finding container 8e44826dfd4c4aec3d44ac5f507d015fd1dc569047d4998fbce530db98a009c7: Status 404 returned error can't find the container with id 8e44826dfd4c4aec3d44ac5f507d015fd1dc569047d4998fbce530db98a009c7 Apr 24 21:24:44.148271 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.148237 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 21:24:44.148332 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.148306 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 21:24:44.891028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.890994 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" event={"ID":"293aaa20-5463-4bbc-be4b-fa5c379dae0c","Type":"ContainerStarted","Data":"e89f7218db3b3918a53a57bb6202f78114a588cc41cc74dc361837d8716272d5"} Apr 24 21:24:44.891028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.891032 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" 
event={"ID":"293aaa20-5463-4bbc-be4b-fa5c379dae0c","Type":"ContainerStarted","Data":"8e44826dfd4c4aec3d44ac5f507d015fd1dc569047d4998fbce530db98a009c7"} Apr 24 21:24:44.891450 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.891119 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:44.918558 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:44.918512 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" podStartSLOduration=1.91849957 podStartE2EDuration="1.91849957s" podCreationTimestamp="2026-04-24 21:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:24:44.91706715 +0000 UTC m=+525.595903823" watchObservedRunningTime="2026-04-24 21:24:44.91849957 +0000 UTC m=+525.597336241" Apr 24 21:24:45.896169 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:45.896139 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rgd49" Apr 24 21:24:45.959700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:45.959669 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"] Apr 24 21:24:45.959903 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:45.959882 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" podUID="1dd0fa2b-28ef-4aff-9929-e03212b3f28b" containerName="discovery" containerID="cri-o://f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875" gracePeriod=30 Apr 24 21:24:46.234963 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.234927 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" Apr 24 21:24:46.364820 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.364790 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-cacerts\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.364991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.364837 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8d9w\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-kube-api-access-m8d9w\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.364991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.364873 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-kubeconfig\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.364991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.364899 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-local-certs\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.364991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.364922 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-ca-configmap\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.364991 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:24:46.364949 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-dns-cert\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.364991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.364985 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-token\") pod \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\" (UID: \"1dd0fa2b-28ef-4aff-9929-e03212b3f28b\") " Apr 24 21:24:46.365344 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.365315 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:24:46.367371 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.367326 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-cacerts" (OuterVolumeSpecName: "cacerts") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:24:46.367667 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.367555 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-local-certs" (OuterVolumeSpecName: "local-certs") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). 
InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:24:46.367667 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.367633 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:24:46.367927 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.367680 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-kube-api-access-m8d9w" (OuterVolumeSpecName: "kube-api-access-m8d9w") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). InnerVolumeSpecName "kube-api-access-m8d9w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:24:46.368212 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.368176 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:24:46.371859 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.371833 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-token" (OuterVolumeSpecName: "istio-token") pod "1dd0fa2b-28ef-4aff-9929-e03212b3f28b" (UID: "1dd0fa2b-28ef-4aff-9929-e03212b3f28b"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:24:46.466160 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.466081 2581 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-cacerts\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.466160 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.466116 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8d9w\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-kube-api-access-m8d9w\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.466160 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.466132 2581 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-kubeconfig\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.466160 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.466147 2581 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-local-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.466160 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.466160 2581 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-ca-configmap\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.466431 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.466173 2581 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-csr-dns-cert\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.466431 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:24:46.466186 2581 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1dd0fa2b-28ef-4aff-9929-e03212b3f28b-istio-token\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:24:46.898207 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.898116 2581 generic.go:358] "Generic (PLEG): container finished" podID="1dd0fa2b-28ef-4aff-9929-e03212b3f28b" containerID="f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875" exitCode=0 Apr 24 21:24:46.898207 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.898184 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" Apr 24 21:24:46.898701 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.898204 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" event={"ID":"1dd0fa2b-28ef-4aff-9929-e03212b3f28b","Type":"ContainerDied","Data":"f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875"} Apr 24 21:24:46.898701 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.898241 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694" event={"ID":"1dd0fa2b-28ef-4aff-9929-e03212b3f28b","Type":"ContainerDied","Data":"19db6937aad8465d0b99fc96cbb9e60f22e6f046ec38ef21fc80db4317d2fb4e"} Apr 24 21:24:46.898701 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.898257 2581 scope.go:117] "RemoveContainer" containerID="f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875" Apr 24 21:24:46.907228 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.907211 2581 scope.go:117] "RemoveContainer" containerID="f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875" Apr 24 21:24:46.907482 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:24:46.907461 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875\": container with ID starting with f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875 not found: ID does not exist" containerID="f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875" Apr 24 21:24:46.907528 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.907493 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875"} err="failed to get container status \"f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875\": rpc error: code = NotFound desc = could not find container \"f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875\": container with ID starting with f17c674785849db6f428d9261e21e08c56a10b4d42d085fef48aaa366bc20875 not found: ID does not exist" Apr 24 21:24:46.921204 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.921183 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"] Apr 24 21:24:46.923672 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:46.923651 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-7z694"] Apr 24 21:24:47.873848 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:47.873807 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd0fa2b-28ef-4aff-9929-e03212b3f28b" path="/var/lib/kubelet/pods/1dd0fa2b-28ef-4aff-9929-e03212b3f28b/volumes" Apr 24 21:24:53.299214 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.299137 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-k8bvx"] Apr 24 21:24:53.299571 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.299511 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1dd0fa2b-28ef-4aff-9929-e03212b3f28b" containerName="discovery" Apr 24 21:24:53.299571 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.299525 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd0fa2b-28ef-4aff-9929-e03212b3f28b" containerName="discovery" Apr 24 21:24:53.299645 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.299577 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1dd0fa2b-28ef-4aff-9929-e03212b3f28b" containerName="discovery" Apr 24 21:24:53.302371 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.302354 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.304977 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.304953 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:24:53.305213 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.305196 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-tqwvq\"" Apr 24 21:24:53.305752 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.305736 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:24:53.305914 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.305894 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:24:53.315740 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.315718 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-k8bvx"] Apr 24 21:24:53.328632 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.328610 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6c9547c57-hr9ff"] Apr 24 21:24:53.331756 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.331735 2581 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:53.334627 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.334606 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:24:53.334721 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.334614 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-w87s7\"" Apr 24 21:24:53.359156 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.359130 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6c9547c57-hr9ff"] Apr 24 21:24:53.424226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.424191 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f94d9007-43fa-4097-a188-cd0d438d005e-cert\") pod \"kserve-controller-manager-74fc8f6f96-k8bvx\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.424396 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.424236 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:53.424396 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.424281 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dth2c\" (UniqueName: \"kubernetes.io/projected/81a8ae8b-f98a-4240-9832-35069d2f1c6e-kube-api-access-dth2c\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: 
\"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:53.424396 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.424329 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blfg5\" (UniqueName: \"kubernetes.io/projected/f94d9007-43fa-4097-a188-cd0d438d005e-kube-api-access-blfg5\") pod \"kserve-controller-manager-74fc8f6f96-k8bvx\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.524943 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.524904 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f94d9007-43fa-4097-a188-cd0d438d005e-cert\") pod \"kserve-controller-manager-74fc8f6f96-k8bvx\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.525132 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.524956 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:53.525132 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.525010 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dth2c\" (UniqueName: \"kubernetes.io/projected/81a8ae8b-f98a-4240-9832-35069d2f1c6e-kube-api-access-dth2c\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:53.525132 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.525038 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-blfg5\" (UniqueName: \"kubernetes.io/projected/f94d9007-43fa-4097-a188-cd0d438d005e-kube-api-access-blfg5\") pod \"kserve-controller-manager-74fc8f6f96-k8bvx\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.525132 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:24:53.525119 2581 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 24 21:24:53.525283 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:24:53.525215 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert podName:81a8ae8b-f98a-4240-9832-35069d2f1c6e nodeName:}" failed. No retries permitted until 2026-04-24 21:24:54.025189523 +0000 UTC m=+534.704026180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert") pod "llmisvc-controller-manager-6c9547c57-hr9ff" (UID: "81a8ae8b-f98a-4240-9832-35069d2f1c6e") : secret "llmisvc-webhook-server-cert" not found Apr 24 21:24:53.527440 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.527403 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f94d9007-43fa-4097-a188-cd0d438d005e-cert\") pod \"kserve-controller-manager-74fc8f6f96-k8bvx\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.536494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.536463 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blfg5\" (UniqueName: \"kubernetes.io/projected/f94d9007-43fa-4097-a188-cd0d438d005e-kube-api-access-blfg5\") pod \"kserve-controller-manager-74fc8f6f96-k8bvx\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 
21:24:53.536646 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.536631 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dth2c\" (UniqueName: \"kubernetes.io/projected/81a8ae8b-f98a-4240-9832-35069d2f1c6e-kube-api-access-dth2c\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:53.615635 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.615565 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:53.959646 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:53.959524 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-k8bvx"] Apr 24 21:24:53.962527 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:24:53.962497 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94d9007_43fa_4097_a188_cd0d438d005e.slice/crio-5ae04f64e2043bee684c223755f33d2d57dc1841967b0c2bd595d716917d9a02 WatchSource:0}: Error finding container 5ae04f64e2043bee684c223755f33d2d57dc1841967b0c2bd595d716917d9a02: Status 404 returned error can't find the container with id 5ae04f64e2043bee684c223755f33d2d57dc1841967b0c2bd595d716917d9a02 Apr 24 21:24:54.029612 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:54.029587 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:54.031837 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:54.031809 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert\") pod \"llmisvc-controller-manager-6c9547c57-hr9ff\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:54.244529 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:54.244454 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:54.579628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:54.579596 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6c9547c57-hr9ff"] Apr 24 21:24:54.582841 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:24:54.582808 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod81a8ae8b_f98a_4240_9832_35069d2f1c6e.slice/crio-c05cc34ab2ca1505ad21f383f829c76d570c65cd849ed08bbfe36cc34d950e82 WatchSource:0}: Error finding container c05cc34ab2ca1505ad21f383f829c76d570c65cd849ed08bbfe36cc34d950e82: Status 404 returned error can't find the container with id c05cc34ab2ca1505ad21f383f829c76d570c65cd849ed08bbfe36cc34d950e82 Apr 24 21:24:54.933398 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:54.933345 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" event={"ID":"81a8ae8b-f98a-4240-9832-35069d2f1c6e","Type":"ContainerStarted","Data":"c05cc34ab2ca1505ad21f383f829c76d570c65cd849ed08bbfe36cc34d950e82"} Apr 24 21:24:54.934663 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:54.934633 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" event={"ID":"f94d9007-43fa-4097-a188-cd0d438d005e","Type":"ContainerStarted","Data":"5ae04f64e2043bee684c223755f33d2d57dc1841967b0c2bd595d716917d9a02"} Apr 24 21:24:56.944013 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:56.943978 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" event={"ID":"f94d9007-43fa-4097-a188-cd0d438d005e","Type":"ContainerStarted","Data":"b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39"} Apr 24 21:24:56.944473 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:56.944108 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:24:56.961205 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:56.961154 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" podStartSLOduration=1.411594399 podStartE2EDuration="3.961135797s" podCreationTimestamp="2026-04-24 21:24:53 +0000 UTC" firstStartedPulling="2026-04-24 21:24:53.963636962 +0000 UTC m=+534.642473612" lastFinishedPulling="2026-04-24 21:24:56.51317836 +0000 UTC m=+537.192015010" observedRunningTime="2026-04-24 21:24:56.959966437 +0000 UTC m=+537.638803110" watchObservedRunningTime="2026-04-24 21:24:56.961135797 +0000 UTC m=+537.639972469" Apr 24 21:24:59.956175 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:59.956138 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" event={"ID":"81a8ae8b-f98a-4240-9832-35069d2f1c6e","Type":"ContainerStarted","Data":"1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52"} Apr 24 21:24:59.956555 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:59.956249 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:24:59.973213 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:24:59.973168 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" podStartSLOduration=2.5019017569999997 podStartE2EDuration="6.973154781s" podCreationTimestamp="2026-04-24 21:24:53 +0000 UTC" 
firstStartedPulling="2026-04-24 21:24:54.584736977 +0000 UTC m=+535.263573639" lastFinishedPulling="2026-04-24 21:24:59.055990009 +0000 UTC m=+539.734826663" observedRunningTime="2026-04-24 21:24:59.972148038 +0000 UTC m=+540.650984721" watchObservedRunningTime="2026-04-24 21:24:59.973154781 +0000 UTC m=+540.651991453" Apr 24 21:25:27.953664 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:27.953634 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:25:30.961788 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:30.961760 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:25:32.198679 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.198641 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-k8bvx"] Apr 24 21:25:32.199067 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.198843 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" podUID="f94d9007-43fa-4097-a188-cd0d438d005e" containerName="manager" containerID="cri-o://b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39" gracePeriod=10 Apr 24 21:25:32.231837 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.231812 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bfwqh"] Apr 24 21:25:32.235192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.235176 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.245252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.245226 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bfwqh"] Apr 24 21:25:32.338661 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.338626 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kskd\" (UniqueName: \"kubernetes.io/projected/0c576436-6187-4ab4-8a5e-798cd5bd02c9-kube-api-access-9kskd\") pod \"kserve-controller-manager-74fc8f6f96-bfwqh\" (UID: \"0c576436-6187-4ab4-8a5e-798cd5bd02c9\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.338804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.338769 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c576436-6187-4ab4-8a5e-798cd5bd02c9-cert\") pod \"kserve-controller-manager-74fc8f6f96-bfwqh\" (UID: \"0c576436-6187-4ab4-8a5e-798cd5bd02c9\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.430548 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.430526 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:25:32.439317 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.439294 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kskd\" (UniqueName: \"kubernetes.io/projected/0c576436-6187-4ab4-8a5e-798cd5bd02c9-kube-api-access-9kskd\") pod \"kserve-controller-manager-74fc8f6f96-bfwqh\" (UID: \"0c576436-6187-4ab4-8a5e-798cd5bd02c9\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.439412 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.439324 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c576436-6187-4ab4-8a5e-798cd5bd02c9-cert\") pod \"kserve-controller-manager-74fc8f6f96-bfwqh\" (UID: \"0c576436-6187-4ab4-8a5e-798cd5bd02c9\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.441498 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.441479 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c576436-6187-4ab4-8a5e-798cd5bd02c9-cert\") pod \"kserve-controller-manager-74fc8f6f96-bfwqh\" (UID: \"0c576436-6187-4ab4-8a5e-798cd5bd02c9\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.460601 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.460536 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kskd\" (UniqueName: \"kubernetes.io/projected/0c576436-6187-4ab4-8a5e-798cd5bd02c9-kube-api-access-9kskd\") pod \"kserve-controller-manager-74fc8f6f96-bfwqh\" (UID: \"0c576436-6187-4ab4-8a5e-798cd5bd02c9\") " pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.540650 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.540624 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f94d9007-43fa-4097-a188-cd0d438d005e-cert\") pod \"f94d9007-43fa-4097-a188-cd0d438d005e\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " Apr 24 21:25:32.540781 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.540655 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blfg5\" (UniqueName: \"kubernetes.io/projected/f94d9007-43fa-4097-a188-cd0d438d005e-kube-api-access-blfg5\") pod \"f94d9007-43fa-4097-a188-cd0d438d005e\" (UID: \"f94d9007-43fa-4097-a188-cd0d438d005e\") " Apr 24 21:25:32.542625 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.542593 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94d9007-43fa-4097-a188-cd0d438d005e-cert" (OuterVolumeSpecName: "cert") pod "f94d9007-43fa-4097-a188-cd0d438d005e" (UID: "f94d9007-43fa-4097-a188-cd0d438d005e"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:25:32.542721 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.542664 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94d9007-43fa-4097-a188-cd0d438d005e-kube-api-access-blfg5" (OuterVolumeSpecName: "kube-api-access-blfg5") pod "f94d9007-43fa-4097-a188-cd0d438d005e" (UID: "f94d9007-43fa-4097-a188-cd0d438d005e"). InnerVolumeSpecName "kube-api-access-blfg5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:32.589364 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.589330 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:32.641608 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.641574 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f94d9007-43fa-4097-a188-cd0d438d005e-cert\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:25:32.641756 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.641611 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-blfg5\" (UniqueName: \"kubernetes.io/projected/f94d9007-43fa-4097-a188-cd0d438d005e-kube-api-access-blfg5\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:25:32.709217 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:32.709169 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-bfwqh"] Apr 24 21:25:32.711671 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:25:32.711620 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c576436_6187_4ab4_8a5e_798cd5bd02c9.slice/crio-dfa4671cab60c600244485abec8dbe3f91d5987fd7f5d4f42c24f37c113153dc WatchSource:0}: Error finding container dfa4671cab60c600244485abec8dbe3f91d5987fd7f5d4f42c24f37c113153dc: Status 404 returned error can't find the container with id dfa4671cab60c600244485abec8dbe3f91d5987fd7f5d4f42c24f37c113153dc Apr 24 21:25:33.064874 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.064791 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" event={"ID":"0c576436-6187-4ab4-8a5e-798cd5bd02c9","Type":"ContainerStarted","Data":"dfa4671cab60c600244485abec8dbe3f91d5987fd7f5d4f42c24f37c113153dc"} Apr 24 21:25:33.065916 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.065891 2581 generic.go:358] "Generic (PLEG): container finished" podID="f94d9007-43fa-4097-a188-cd0d438d005e" 
containerID="b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39" exitCode=0 Apr 24 21:25:33.066025 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.065947 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" event={"ID":"f94d9007-43fa-4097-a188-cd0d438d005e","Type":"ContainerDied","Data":"b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39"} Apr 24 21:25:33.066025 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.065964 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" event={"ID":"f94d9007-43fa-4097-a188-cd0d438d005e","Type":"ContainerDied","Data":"5ae04f64e2043bee684c223755f33d2d57dc1841967b0c2bd595d716917d9a02"} Apr 24 21:25:33.066025 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.065979 2581 scope.go:117] "RemoveContainer" containerID="b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39" Apr 24 21:25:33.066134 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.065978 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-k8bvx" Apr 24 21:25:33.074133 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.074117 2581 scope.go:117] "RemoveContainer" containerID="b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39" Apr 24 21:25:33.074383 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:25:33.074365 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39\": container with ID starting with b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39 not found: ID does not exist" containerID="b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39" Apr 24 21:25:33.074463 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.074389 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39"} err="failed to get container status \"b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39\": rpc error: code = NotFound desc = could not find container \"b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39\": container with ID starting with b9d77c7fdeafe252b7f681c5387cf5b419dd59ac5d13f0eb4c7d4cba2e1e7a39 not found: ID does not exist" Apr 24 21:25:33.087542 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.087516 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-k8bvx"] Apr 24 21:25:33.091259 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.091236 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-k8bvx"] Apr 24 21:25:33.874336 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:33.874304 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94d9007-43fa-4097-a188-cd0d438d005e" 
path="/var/lib/kubelet/pods/f94d9007-43fa-4097-a188-cd0d438d005e/volumes" Apr 24 21:25:34.070304 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:34.070265 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" event={"ID":"0c576436-6187-4ab4-8a5e-798cd5bd02c9","Type":"ContainerStarted","Data":"c45744d131657ef028056cd7e8edf97c827f2bd852816bc64c56002cb610c651"} Apr 24 21:25:34.070529 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:34.070331 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:25:34.088656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:34.088609 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" podStartSLOduration=1.625586891 podStartE2EDuration="2.088597025s" podCreationTimestamp="2026-04-24 21:25:32 +0000 UTC" firstStartedPulling="2026-04-24 21:25:32.712767819 +0000 UTC m=+573.391604469" lastFinishedPulling="2026-04-24 21:25:33.17577795 +0000 UTC m=+573.854614603" observedRunningTime="2026-04-24 21:25:34.087136739 +0000 UTC m=+574.765973410" watchObservedRunningTime="2026-04-24 21:25:34.088597025 +0000 UTC m=+574.767433712" Apr 24 21:25:59.813408 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:59.813382 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:25:59.815491 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:59.815470 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:25:59.818848 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:59.818830 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:25:59.820953 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:25:59.820934 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:26:05.079637 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:05.079603 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-bfwqh" Apr 24 21:26:06.005353 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.005321 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-znlgd"] Apr 24 21:26:06.006096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.006061 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f94d9007-43fa-4097-a188-cd0d438d005e" containerName="manager" Apr 24 21:26:06.006096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.006081 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94d9007-43fa-4097-a188-cd0d438d005e" containerName="manager" Apr 24 21:26:06.006341 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.006321 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f94d9007-43fa-4097-a188-cd0d438d005e" containerName="manager" Apr 24 21:26:06.009128 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.009107 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.011504 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.011478 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:26:06.011635 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.011524 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cssnz\"" Apr 24 21:26:06.015575 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.015550 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-znlgd"] Apr 24 21:26:06.120625 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.120587 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0589c56e-7987-421c-82d1-0b565b112246-tls-certs\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.120625 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.120625 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvq8\" (UniqueName: \"kubernetes.io/projected/0589c56e-7987-421c-82d1-0b565b112246-kube-api-access-hpvq8\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.221237 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.221202 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0589c56e-7987-421c-82d1-0b565b112246-tls-certs\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.221458 ip-10-0-139-15 kubenswrapper[2581]: 
I0424 21:26:06.221248 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvq8\" (UniqueName: \"kubernetes.io/projected/0589c56e-7987-421c-82d1-0b565b112246-kube-api-access-hpvq8\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.221458 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:26:06.221367 2581 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 21:26:06.221595 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:26:06.221473 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0589c56e-7987-421c-82d1-0b565b112246-tls-certs podName:0589c56e-7987-421c-82d1-0b565b112246 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:06.721451296 +0000 UTC m=+607.400287950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0589c56e-7987-421c-82d1-0b565b112246-tls-certs") pod "model-serving-api-86f7b4b499-znlgd" (UID: "0589c56e-7987-421c-82d1-0b565b112246") : secret "model-serving-api-tls" not found Apr 24 21:26:06.231265 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.231239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvq8\" (UniqueName: \"kubernetes.io/projected/0589c56e-7987-421c-82d1-0b565b112246-kube-api-access-hpvq8\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.724523 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.724489 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0589c56e-7987-421c-82d1-0b565b112246-tls-certs\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " 
pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.727009 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.726984 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0589c56e-7987-421c-82d1-0b565b112246-tls-certs\") pod \"model-serving-api-86f7b4b499-znlgd\" (UID: \"0589c56e-7987-421c-82d1-0b565b112246\") " pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:06.922505 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:06.922472 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:07.045470 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:07.045447 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-znlgd"] Apr 24 21:26:07.047447 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:26:07.047397 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0589c56e_7987_421c_82d1_0b565b112246.slice/crio-268a45836a51eaf45d09f18d0ad2f7a5a52c54554cacf868a3ac8d27fc3267d9 WatchSource:0}: Error finding container 268a45836a51eaf45d09f18d0ad2f7a5a52c54554cacf868a3ac8d27fc3267d9: Status 404 returned error can't find the container with id 268a45836a51eaf45d09f18d0ad2f7a5a52c54554cacf868a3ac8d27fc3267d9 Apr 24 21:26:07.176971 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:07.176939 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-znlgd" event={"ID":"0589c56e-7987-421c-82d1-0b565b112246","Type":"ContainerStarted","Data":"268a45836a51eaf45d09f18d0ad2f7a5a52c54554cacf868a3ac8d27fc3267d9"} Apr 24 21:26:09.185785 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:09.185743 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-znlgd" 
event={"ID":"0589c56e-7987-421c-82d1-0b565b112246","Type":"ContainerStarted","Data":"018ee32b4a2d459a4ddc23efe0c7e677025f5f7d6c19667e9cae5a0399f07bb6"} Apr 24 21:26:09.186190 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:09.185858 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:26:09.201672 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:09.201621 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-znlgd" podStartSLOduration=2.7440004 podStartE2EDuration="4.201604387s" podCreationTimestamp="2026-04-24 21:26:05 +0000 UTC" firstStartedPulling="2026-04-24 21:26:07.049527553 +0000 UTC m=+607.728364204" lastFinishedPulling="2026-04-24 21:26:08.50713153 +0000 UTC m=+609.185968191" observedRunningTime="2026-04-24 21:26:09.200267764 +0000 UTC m=+609.879104447" watchObservedRunningTime="2026-04-24 21:26:09.201604387 +0000 UTC m=+609.880441060" Apr 24 21:26:20.193336 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:26:20.193264 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-znlgd" Apr 24 21:27:29.267581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.267548 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"] Apr 24 21:27:29.271149 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.271129 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" Apr 24 21:27:29.273681 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.273655 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\"" Apr 24 21:27:29.274360 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.274338 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 24 21:27:29.274540 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.274407 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.274655 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.274409 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.274726 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.274492 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-p58m4\"" Apr 24 21:27:29.280770 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.280750 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"] Apr 24 21:27:29.377755 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.377719 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" Apr 24 21:27:29.377755 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:27:29.377758 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.377949 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.377791 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.377949 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.377838 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.377949 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.377887 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.377949 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.377907 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdk4f\" (UniqueName: \"kubernetes.io/projected/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kube-api-access-tdk4f\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.478629 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.478585 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.478629 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.478630 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.478836 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.478754 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.478836 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.478805 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.478911 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.478891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.478948 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.478930 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdk4f\" (UniqueName: \"kubernetes.io/projected/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kube-api-access-tdk4f\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.479153 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.479133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.479200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.479134 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.479200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.479179 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.479281 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.479268 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.481288 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.481265 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.487371 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.487351 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdk4f\" (UniqueName: \"kubernetes.io/projected/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kube-api-access-tdk4f\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.582013 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.581924 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:27:29.918136 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:29.918111 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"]
Apr 24 21:27:29.920764 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:27:29.920731 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e8b5c7_b400_4b9e_a01a_6d3df845c17d.slice/crio-d66d8022563ce13215142eac61bec1f4393635ff100b1abe53c5e5df800614af WatchSource:0}: Error finding container d66d8022563ce13215142eac61bec1f4393635ff100b1abe53c5e5df800614af: Status 404 returned error can't find the container with id d66d8022563ce13215142eac61bec1f4393635ff100b1abe53c5e5df800614af
Apr 24 21:27:30.458740 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:30.458680 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerStarted","Data":"d66d8022563ce13215142eac61bec1f4393635ff100b1abe53c5e5df800614af"}
Apr 24 21:27:33.470136 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:33.470100 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerStarted","Data":"d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50"}
Apr 24 21:27:34.474666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:34.474629 2581 generic.go:358] "Generic (PLEG): container finished" podID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerID="d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50" exitCode=0
Apr 24 21:27:34.475039 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:34.474678 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerDied","Data":"d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50"}
Apr 24 21:27:36.484750 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:27:36.484712 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerStarted","Data":"81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091"}
Apr 24 21:28:06.608030 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:06.607984 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerStarted","Data":"fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8"}
Apr 24 21:28:06.608539 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:06.608142 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:06.610601 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:06.610580 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:06.629856 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:06.629802 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" podStartSLOduration=1.956259754 podStartE2EDuration="37.629788804s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.923132886 +0000 UTC m=+690.601969553" lastFinishedPulling="2026-04-24 21:28:05.596661948 +0000 UTC m=+726.275498603" observedRunningTime="2026-04-24 21:28:06.627931167 +0000 UTC m=+727.306767840" watchObservedRunningTime="2026-04-24 21:28:06.629788804 +0000 UTC m=+727.308625477"
Apr 24 21:28:09.582991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:09.582956 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:09.583435 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:09.583004 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:19.584731 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:19.584689 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:19.585820 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:19.585800 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:40.390539 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:40.390500 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"]
Apr 24 21:28:40.391037 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:40.390876 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="main" containerID="cri-o://81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091" gracePeriod=30
Apr 24 21:28:40.391037 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:40.391001 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="tokenizer" containerID="cri-o://fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8" gracePeriod=30
Apr 24 21:28:40.731740 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:40.731694 2581 generic.go:358] "Generic (PLEG): container finished" podID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerID="81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091" exitCode=0
Apr 24 21:28:40.731893 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:40.731762 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerDied","Data":"81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091"}
Apr 24 21:28:41.550246 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.550215 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:41.610079 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610019 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdk4f\" (UniqueName: \"kubernetes.io/projected/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kube-api-access-tdk4f\") pod \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") "
Apr 24 21:28:41.610079 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610077 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-tmp\") pod \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") "
Apr 24 21:28:41.610249 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610094 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-cache\") pod \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") "
Apr 24 21:28:41.610249 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610134 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-uds\") pod \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") "
Apr 24 21:28:41.610249 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610178 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kserve-provision-location\") pod \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") "
Apr 24 21:28:41.610249 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610207 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tls-certs\") pod \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\" (UID: \"81e8b5c7-b400-4b9e-a01a-6d3df845c17d\") "
Apr 24 21:28:41.610474 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610406 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "81e8b5c7-b400-4b9e-a01a-6d3df845c17d" (UID: "81e8b5c7-b400-4b9e-a01a-6d3df845c17d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:41.610474 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610447 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "81e8b5c7-b400-4b9e-a01a-6d3df845c17d" (UID: "81e8b5c7-b400-4b9e-a01a-6d3df845c17d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:41.610598 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610524 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "81e8b5c7-b400-4b9e-a01a-6d3df845c17d" (UID: "81e8b5c7-b400-4b9e-a01a-6d3df845c17d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:41.610962 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.610936 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "81e8b5c7-b400-4b9e-a01a-6d3df845c17d" (UID: "81e8b5c7-b400-4b9e-a01a-6d3df845c17d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:41.612226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.612201 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kube-api-access-tdk4f" (OuterVolumeSpecName: "kube-api-access-tdk4f") pod "81e8b5c7-b400-4b9e-a01a-6d3df845c17d" (UID: "81e8b5c7-b400-4b9e-a01a-6d3df845c17d"). InnerVolumeSpecName "kube-api-access-tdk4f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:41.612305 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.612254 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "81e8b5c7-b400-4b9e-a01a-6d3df845c17d" (UID: "81e8b5c7-b400-4b9e-a01a-6d3df845c17d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:28:41.711280 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.711253 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:28:41.711280 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.711279 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:28:41.711458 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.711290 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:28:41.711458 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.711300 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdk4f\" (UniqueName: \"kubernetes.io/projected/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-kube-api-access-tdk4f\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:28:41.711458 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.711309 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:28:41.711458 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.711318 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/81e8b5c7-b400-4b9e-a01a-6d3df845c17d-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:28:41.736662 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.736634 2581 generic.go:358] "Generic (PLEG): container finished" podID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerID="fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8" exitCode=0
Apr 24 21:28:41.736783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.736675 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerDied","Data":"fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8"}
Apr 24 21:28:41.736783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.736698 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g" event={"ID":"81e8b5c7-b400-4b9e-a01a-6d3df845c17d","Type":"ContainerDied","Data":"d66d8022563ce13215142eac61bec1f4393635ff100b1abe53c5e5df800614af"}
Apr 24 21:28:41.736783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.736702 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"
Apr 24 21:28:41.736783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.736720 2581 scope.go:117] "RemoveContainer" containerID="fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8"
Apr 24 21:28:41.745322 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.745308 2581 scope.go:117] "RemoveContainer" containerID="81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091"
Apr 24 21:28:41.752312 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.752294 2581 scope.go:117] "RemoveContainer" containerID="d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50"
Apr 24 21:28:41.760517 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.760496 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"]
Apr 24 21:28:41.760593 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.760564 2581 scope.go:117] "RemoveContainer" containerID="fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8"
Apr 24 21:28:41.760893 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:28:41.760858 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8\": container with ID starting with fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8 not found: ID does not exist" containerID="fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8"
Apr 24 21:28:41.761057 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.760901 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8"} err="failed to get container status \"fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8\": rpc error: code = NotFound desc = could not find container \"fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8\": container with ID starting with fa72a7e900b3c486c4fc9e6b5299afc3627e1b6fc229fe1e014dc97e9555ccf8 not found: ID does not exist"
Apr 24 21:28:41.761057 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.760925 2581 scope.go:117] "RemoveContainer" containerID="81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091"
Apr 24 21:28:41.761568 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:28:41.761546 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091\": container with ID starting with 81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091 not found: ID does not exist" containerID="81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091"
Apr 24 21:28:41.761655 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.761571 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091"} err="failed to get container status \"81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091\": rpc error: code = NotFound desc = could not find container \"81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091\": container with ID starting with 81135bbaed82a8ee92476b26041a0d22f1bedc6ee98fbeb477de0ea6e5ad2091 not found: ID does not exist"
Apr 24 21:28:41.761655 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.761587 2581 scope.go:117] "RemoveContainer" containerID="d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50"
Apr 24 21:28:41.761843 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:28:41.761826 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50\": container with ID starting with d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50 not found: ID does not exist" containerID="d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50"
Apr 24 21:28:41.761891 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.761848 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50"} err="failed to get container status \"d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50\": rpc error: code = NotFound desc = could not find container \"d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50\": container with ID starting with d027c391dc5c11f25986f79f68cfdef4000db88564e49a0940c409ea14982d50 not found: ID does not exist"
Apr 24 21:28:41.763004 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.762985 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d95cc8465g"]
Apr 24 21:28:41.874456 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:28:41.874379 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" path="/var/lib/kubelet/pods/81e8b5c7-b400-4b9e-a01a-6d3df845c17d/volumes"
Apr 24 21:29:01.210012 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.209978 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"]
Apr 24 21:29:01.210680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210504 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="tokenizer"
Apr 24 21:29:01.210680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210523 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="tokenizer"
Apr 24 21:29:01.210680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210551 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="main"
Apr 24 21:29:01.210680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210561 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="main"
Apr 24 21:29:01.210680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210588 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="storage-initializer"
Apr 24 21:29:01.210680 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210600 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="storage-initializer"
Apr 24 21:29:01.211002 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210713 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="main"
Apr 24 21:29:01.211002 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.210728 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="81e8b5c7-b400-4b9e-a01a-6d3df845c17d" containerName="tokenizer"
Apr 24 21:29:01.214570 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.214551 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.217171 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.217149 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-rfvh2\""
Apr 24 21:29:01.217280 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.217159 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:29:01.217928 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.217912 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\""
Apr 24 21:29:01.217997 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.217971 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 24 21:29:01.218051 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.218027 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:29:01.226519 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.226495 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"]
Apr 24 21:29:01.368122 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.368079 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/666a701f-83d3-4366-a1b8-83f08a211715-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.368122 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.368125 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkzbl\" (UniqueName: \"kubernetes.io/projected/666a701f-83d3-4366-a1b8-83f08a211715-kube-api-access-kkzbl\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.368329 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.368162 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.368329 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.368183 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.368329 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.368205 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.368329 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.368288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469190 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkzbl\" (UniqueName: \"kubernetes.io/projected/666a701f-83d3-4366-a1b8-83f08a211715-kube-api-access-kkzbl\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469190 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469154 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469190 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469200 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469228 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/666a701f-83d3-4366-a1b8-83f08a211715-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469682 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469619 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469682 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469643 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469682 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469668 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.469859 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.469740 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"
Apr 24 21:29:01.471834 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.471810 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/666a701f-83d3-4366-a1b8-83f08a211715-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:01.479418 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.479395 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkzbl\" (UniqueName: \"kubernetes.io/projected/666a701f-83d3-4366-a1b8-83f08a211715-kube-api-access-kkzbl\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:01.525448 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.525392 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:01.654369 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.654323 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"] Apr 24 21:29:01.657040 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:29:01.657011 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod666a701f_83d3_4366_a1b8_83f08a211715.slice/crio-cbb65b87fb6ba29325cd4c4b71e33c2e89ce2097604c301e14c66ef3e20f4c33 WatchSource:0}: Error finding container cbb65b87fb6ba29325cd4c4b71e33c2e89ce2097604c301e14c66ef3e20f4c33: Status 404 returned error can't find the container with id cbb65b87fb6ba29325cd4c4b71e33c2e89ce2097604c301e14c66ef3e20f4c33 Apr 24 21:29:01.658827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.658809 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:29:01.815491 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.815452 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerStarted","Data":"e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140"} Apr 24 21:29:01.815491 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:01.815496 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerStarted","Data":"cbb65b87fb6ba29325cd4c4b71e33c2e89ce2097604c301e14c66ef3e20f4c33"} Apr 24 21:29:02.820466 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:02.820352 2581 generic.go:358] "Generic (PLEG): container finished" podID="666a701f-83d3-4366-a1b8-83f08a211715" containerID="e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140" exitCode=0 Apr 24 21:29:02.820905 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:02.820460 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerDied","Data":"e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140"} Apr 24 21:29:03.826850 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:03.826812 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerStarted","Data":"84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68"} Apr 24 21:29:03.826850 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:03.826847 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" 
event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerStarted","Data":"ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097"} Apr 24 21:29:03.827385 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:03.826960 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:11.525684 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:11.525648 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:11.526135 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:11.525695 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:11.528336 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:11.528309 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:11.548100 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:11.548051 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" podStartSLOduration=10.548035502 podStartE2EDuration="10.548035502s" podCreationTimestamp="2026-04-24 21:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:03.847527794 +0000 UTC m=+784.526364467" watchObservedRunningTime="2026-04-24 21:29:11.548035502 +0000 UTC m=+792.226872174" Apr 24 21:29:11.857193 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:11.857112 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:26.802627 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.802552 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx"] Apr 24 21:29:26.806387 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.806366 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.808687 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.808668 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 24 21:29:26.808787 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.808730 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-s22xc\"" Apr 24 21:29:26.816538 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.816514 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx"] Apr 24 21:29:26.880725 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.880690 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.880871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.880748 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.880871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.880775 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.880871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.880800 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z8d9\" (UniqueName: \"kubernetes.io/projected/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kube-api-access-6z8d9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.880871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.880826 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.881021 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.880889 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982281 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982471 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982471 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982358 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982485 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982532 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z8d9\" (UniqueName: \"kubernetes.io/projected/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kube-api-access-6z8d9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982745 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982689 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982745 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982732 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982849 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982822 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.982849 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.982820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.984724 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.984705 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:26.992268 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:26.992248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z8d9\" (UniqueName: 
\"kubernetes.io/projected/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kube-api-access-6z8d9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:27.118983 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:27.118903 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:27.249032 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:27.249002 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx"] Apr 24 21:29:27.251903 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:29:27.251873 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63a47288_d4a7_4ef4_b21b_8744f2a58afe.slice/crio-511c9cd55779b00d41cd172fcbdd7d1bcd0544b61b4904825d40ec96f3a27630 WatchSource:0}: Error finding container 511c9cd55779b00d41cd172fcbdd7d1bcd0544b61b4904825d40ec96f3a27630: Status 404 returned error can't find the container with id 511c9cd55779b00d41cd172fcbdd7d1bcd0544b61b4904825d40ec96f3a27630 Apr 24 21:29:27.910916 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:27.910880 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerStarted","Data":"a1506101aed5376666b25784ddde934617daed3ba67aaa6099fcae5f4974a2a7"} Apr 24 21:29:27.910916 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:27.910919 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" 
event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerStarted","Data":"511c9cd55779b00d41cd172fcbdd7d1bcd0544b61b4904825d40ec96f3a27630"} Apr 24 21:29:28.915331 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:28.915294 2581 generic.go:358] "Generic (PLEG): container finished" podID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerID="a1506101aed5376666b25784ddde934617daed3ba67aaa6099fcae5f4974a2a7" exitCode=0 Apr 24 21:29:28.915725 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:28.915375 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerDied","Data":"a1506101aed5376666b25784ddde934617daed3ba67aaa6099fcae5f4974a2a7"} Apr 24 21:29:29.920247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:29.920208 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerStarted","Data":"2c9c152cae51f900aaf6d70ec5f72998e25457ccb232c7c124dd8fe559696ca7"} Apr 24 21:29:29.920636 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:29.920253 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerStarted","Data":"161a654396d412ba7ac2082b5cd4be476754220a5f0d568615e730067cecc3d3"} Apr 24 21:29:29.920636 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:29.920357 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:29.941276 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:29.941227 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" podStartSLOduration=3.941212827 podStartE2EDuration="3.941212827s" podCreationTimestamp="2026-04-24 21:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:29.939529041 +0000 UTC m=+810.618365714" watchObservedRunningTime="2026-04-24 21:29:29.941212827 +0000 UTC m=+810.620049499" Apr 24 21:29:32.861583 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:32.861554 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:29:37.120107 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:37.120021 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:37.120107 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:37.120064 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:37.122580 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:37.122547 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:37.949286 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:37.949254 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:29:58.953470 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:29:58.953419 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:30:13.224134 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:30:13.224096 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"] Apr 24 21:30:13.224718 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:13.224522 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="main" containerID="cri-o://ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097" gracePeriod=30 Apr 24 21:30:13.224718 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:13.224698 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="tokenizer" containerID="cri-o://84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68" gracePeriod=30 Apr 24 21:30:14.074929 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.074896 2581 generic.go:358] "Generic (PLEG): container finished" podID="666a701f-83d3-4366-a1b8-83f08a211715" containerID="ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097" exitCode=0 Apr 24 21:30:14.075100 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.074970 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerDied","Data":"ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097"} Apr 24 21:30:14.383784 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.383761 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:30:14.485406 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485376 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-kserve-provision-location\") pod \"666a701f-83d3-4366-a1b8-83f08a211715\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " Apr 24 21:30:14.485585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485476 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-cache\") pod \"666a701f-83d3-4366-a1b8-83f08a211715\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " Apr 24 21:30:14.485585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485504 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-tmp\") pod \"666a701f-83d3-4366-a1b8-83f08a211715\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " Apr 24 21:30:14.485585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485564 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-uds\") pod \"666a701f-83d3-4366-a1b8-83f08a211715\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " Apr 24 21:30:14.485711 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485609 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/666a701f-83d3-4366-a1b8-83f08a211715-tls-certs\") pod \"666a701f-83d3-4366-a1b8-83f08a211715\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " Apr 24 
21:30:14.485711 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485634 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkzbl\" (UniqueName: \"kubernetes.io/projected/666a701f-83d3-4366-a1b8-83f08a211715-kube-api-access-kkzbl\") pod \"666a701f-83d3-4366-a1b8-83f08a211715\" (UID: \"666a701f-83d3-4366-a1b8-83f08a211715\") " Apr 24 21:30:14.485823 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485704 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "666a701f-83d3-4366-a1b8-83f08a211715" (UID: "666a701f-83d3-4366-a1b8-83f08a211715"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:14.485878 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485860 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:30:14.485932 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485850 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "666a701f-83d3-4366-a1b8-83f08a211715" (UID: "666a701f-83d3-4366-a1b8-83f08a211715"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:14.485932 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.485923 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "666a701f-83d3-4366-a1b8-83f08a211715" (UID: "666a701f-83d3-4366-a1b8-83f08a211715"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:14.486247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.486218 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "666a701f-83d3-4366-a1b8-83f08a211715" (UID: "666a701f-83d3-4366-a1b8-83f08a211715"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:14.487737 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.487716 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666a701f-83d3-4366-a1b8-83f08a211715-kube-api-access-kkzbl" (OuterVolumeSpecName: "kube-api-access-kkzbl") pod "666a701f-83d3-4366-a1b8-83f08a211715" (UID: "666a701f-83d3-4366-a1b8-83f08a211715"). InnerVolumeSpecName "kube-api-access-kkzbl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:14.487841 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.487742 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666a701f-83d3-4366-a1b8-83f08a211715-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "666a701f-83d3-4366-a1b8-83f08a211715" (UID: "666a701f-83d3-4366-a1b8-83f08a211715"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:14.586508 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.586482 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:30:14.586508 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.586506 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:30:14.586656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.586516 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/666a701f-83d3-4366-a1b8-83f08a211715-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:30:14.586656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.586524 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkzbl\" (UniqueName: \"kubernetes.io/projected/666a701f-83d3-4366-a1b8-83f08a211715-kube-api-access-kkzbl\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:30:14.586656 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:14.586533 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/666a701f-83d3-4366-a1b8-83f08a211715-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:30:15.086469 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.086416 2581 generic.go:358] "Generic (PLEG): container finished" podID="666a701f-83d3-4366-a1b8-83f08a211715" containerID="84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68" exitCode=0 Apr 24 21:30:15.086662 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.086494 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerDied","Data":"84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68"} Apr 24 21:30:15.086662 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.086520 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" event={"ID":"666a701f-83d3-4366-a1b8-83f08a211715","Type":"ContainerDied","Data":"cbb65b87fb6ba29325cd4c4b71e33c2e89ce2097604c301e14c66ef3e20f4c33"} Apr 24 21:30:15.086662 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.086534 2581 scope.go:117] "RemoveContainer" containerID="84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68" Apr 24 21:30:15.086662 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.086539 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7" Apr 24 21:30:15.095523 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.095504 2581 scope.go:117] "RemoveContainer" containerID="ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097" Apr 24 21:30:15.103563 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.103546 2581 scope.go:117] "RemoveContainer" containerID="e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140" Apr 24 21:30:15.108635 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.108615 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"] Apr 24 21:30:15.111031 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.111013 2581 scope.go:117] "RemoveContainer" containerID="84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68" Apr 24 21:30:15.111270 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:30:15.111252 2581 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68\": container with ID starting with 84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68 not found: ID does not exist" containerID="84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68" Apr 24 21:30:15.111328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.111279 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68"} err="failed to get container status \"84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68\": rpc error: code = NotFound desc = could not find container \"84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68\": container with ID starting with 84798a14d0f3e973e8f6152fb60f3fa07b503d3a51f4ae3e2d0915fe623c6b68 not found: ID does not exist" Apr 24 21:30:15.111328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.111300 2581 scope.go:117] "RemoveContainer" containerID="ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097" Apr 24 21:30:15.111597 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:30:15.111572 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097\": container with ID starting with ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097 not found: ID does not exist" containerID="ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097" Apr 24 21:30:15.111671 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.111608 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097"} err="failed to get container status 
\"ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097\": rpc error: code = NotFound desc = could not find container \"ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097\": container with ID starting with ebacd73ef3e80111fabe9bd2ced9840b360354b5062a18509b869dce21d4e097 not found: ID does not exist" Apr 24 21:30:15.111671 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.111630 2581 scope.go:117] "RemoveContainer" containerID="e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140" Apr 24 21:30:15.111936 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:30:15.111895 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140\": container with ID starting with e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140 not found: ID does not exist" containerID="e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140" Apr 24 21:30:15.112051 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.111935 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140"} err="failed to get container status \"e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140\": rpc error: code = NotFound desc = could not find container \"e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140\": container with ID starting with e07d230b16ebb9b7b92706c5b57d9fe4dbdee1915a6e5ed982af13014dda7140 not found: ID does not exist" Apr 24 21:30:15.113977 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.113956 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-76bc46kzj7"] Apr 24 21:30:15.874498 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:15.874458 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="666a701f-83d3-4366-a1b8-83f08a211715" path="/var/lib/kubelet/pods/666a701f-83d3-4366-a1b8-83f08a211715/volumes" Apr 24 21:30:17.394902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.394869 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n"] Apr 24 21:30:17.395405 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395385 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="storage-initializer" Apr 24 21:30:17.395532 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395408 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="storage-initializer" Apr 24 21:30:17.395532 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395443 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="tokenizer" Apr 24 21:30:17.395532 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395454 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="tokenizer" Apr 24 21:30:17.395532 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395485 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="main" Apr 24 21:30:17.395532 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395494 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="main" Apr 24 21:30:17.395780 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395630 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="main" Apr 24 21:30:17.395780 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.395653 2581 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="666a701f-83d3-4366-a1b8-83f08a211715" containerName="tokenizer" Apr 24 21:30:17.400764 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.400743 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.403308 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.403286 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-t7b2b\"" Apr 24 21:30:17.403419 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.403313 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 21:30:17.426045 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.426011 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n"] Apr 24 21:30:17.509239 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.509202 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.509408 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.509245 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.509408 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.509274 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.509408 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.509358 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.509408 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.509377 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.509408 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.509398 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wls45\" (UniqueName: \"kubernetes.io/projected/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kube-api-access-wls45\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: 
\"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610537 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610500 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610537 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610540 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610743 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610565 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610743 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610640 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610743 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610664 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610743 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610693 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wls45\" (UniqueName: \"kubernetes.io/projected/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kube-api-access-wls45\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610917 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610896 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.610956 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.610930 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.611096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.611080 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.611134 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.611081 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.613106 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.613086 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.621302 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.621276 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wls45\" (UniqueName: \"kubernetes.io/projected/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kube-api-access-wls45\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.711736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.711708 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:17.858374 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:17.858336 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n"] Apr 24 21:30:17.861452 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:30:17.861391 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f0aa52b_f02b_45fe_bd05_ca6bb9ddcf06.slice/crio-14a8e7db35e8f8a750822d1931f6c1523b3dbfbec6b71920b2d3ffdc6242b9c2 WatchSource:0}: Error finding container 14a8e7db35e8f8a750822d1931f6c1523b3dbfbec6b71920b2d3ffdc6242b9c2: Status 404 returned error can't find the container with id 14a8e7db35e8f8a750822d1931f6c1523b3dbfbec6b71920b2d3ffdc6242b9c2 Apr 24 21:30:18.099577 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:18.099486 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerStarted","Data":"a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8"} Apr 24 21:30:18.099577 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:18.099525 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerStarted","Data":"14a8e7db35e8f8a750822d1931f6c1523b3dbfbec6b71920b2d3ffdc6242b9c2"} Apr 24 21:30:19.104103 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:19.104010 2581 generic.go:358] "Generic (PLEG): 
container finished" podID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerID="a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8" exitCode=0 Apr 24 21:30:19.104103 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:19.104074 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerDied","Data":"a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8"} Apr 24 21:30:20.108952 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:20.108917 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerStarted","Data":"f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b"} Apr 24 21:30:20.108952 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:20.108954 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerStarted","Data":"62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802"} Apr 24 21:30:20.109404 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:20.109082 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:20.130214 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:20.130150 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" podStartSLOduration=3.130134075 podStartE2EDuration="3.130134075s" podCreationTimestamp="2026-04-24 21:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:30:20.127813042 +0000 UTC m=+860.806649726" watchObservedRunningTime="2026-04-24 21:30:20.130134075 +0000 UTC m=+860.808970748" Apr 24 21:30:27.712801 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:27.712758 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:27.712801 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:27.712811 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:27.715597 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:27.715569 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:28.136544 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:28.136457 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:49.140596 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:49.140509 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:30:59.840886 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:59.840856 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:30:59.841400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:59.841063 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:30:59.846382 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:30:59.846361 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:30:59.846508 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:30:59.846483 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:33:12.695226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:12.695192 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx"] Apr 24 21:33:12.695810 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:12.695612 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="main" containerID="cri-o://161a654396d412ba7ac2082b5cd4be476754220a5f0d568615e730067cecc3d3" gracePeriod=30 Apr 24 21:33:12.695810 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:12.695687 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="tokenizer" containerID="cri-o://2c9c152cae51f900aaf6d70ec5f72998e25457ccb232c7c124dd8fe559696ca7" gracePeriod=30 Apr 24 21:33:13.718258 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.718227 2581 generic.go:358] "Generic (PLEG): container finished" podID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerID="2c9c152cae51f900aaf6d70ec5f72998e25457ccb232c7c124dd8fe559696ca7" exitCode=0 Apr 24 21:33:13.718258 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.718257 2581 generic.go:358] "Generic (PLEG): container finished" podID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" 
containerID="161a654396d412ba7ac2082b5cd4be476754220a5f0d568615e730067cecc3d3" exitCode=0 Apr 24 21:33:13.718639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.718280 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerDied","Data":"2c9c152cae51f900aaf6d70ec5f72998e25457ccb232c7c124dd8fe559696ca7"} Apr 24 21:33:13.718639 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.718317 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerDied","Data":"161a654396d412ba7ac2082b5cd4be476754220a5f0d568615e730067cecc3d3"} Apr 24 21:33:13.848669 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.848640 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:33:13.871749 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871718 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-cache\") pod \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " Apr 24 21:33:13.871885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871763 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-uds\") pod \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " Apr 24 21:33:13.871885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871802 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6z8d9\" (UniqueName: \"kubernetes.io/projected/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kube-api-access-6z8d9\") pod \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " Apr 24 21:33:13.871885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871832 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kserve-provision-location\") pod \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " Apr 24 21:33:13.872052 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871889 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-tmp\") pod \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " Apr 24 21:33:13.872052 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871917 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tls-certs\") pod \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\" (UID: \"63a47288-d4a7-4ef4-b21b-8744f2a58afe\") " Apr 24 21:33:13.872052 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.871998 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "63a47288-d4a7-4ef4-b21b-8744f2a58afe" (UID: "63a47288-d4a7-4ef4-b21b-8744f2a58afe"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:13.872206 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.872093 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "63a47288-d4a7-4ef4-b21b-8744f2a58afe" (UID: "63a47288-d4a7-4ef4-b21b-8744f2a58afe"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:13.872206 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.872200 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:33:13.872303 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.872218 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:33:13.872554 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.872517 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "63a47288-d4a7-4ef4-b21b-8744f2a58afe" (UID: "63a47288-d4a7-4ef4-b21b-8744f2a58afe"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:13.872824 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.872792 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "63a47288-d4a7-4ef4-b21b-8744f2a58afe" (UID: "63a47288-d4a7-4ef4-b21b-8744f2a58afe"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:13.874170 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.874133 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kube-api-access-6z8d9" (OuterVolumeSpecName: "kube-api-access-6z8d9") pod "63a47288-d4a7-4ef4-b21b-8744f2a58afe" (UID: "63a47288-d4a7-4ef4-b21b-8744f2a58afe"). InnerVolumeSpecName "kube-api-access-6z8d9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:13.874508 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.874482 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "63a47288-d4a7-4ef4-b21b-8744f2a58afe" (UID: "63a47288-d4a7-4ef4-b21b-8744f2a58afe"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:13.972626 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.972600 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6z8d9\" (UniqueName: \"kubernetes.io/projected/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kube-api-access-6z8d9\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:33:13.972626 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.972623 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:33:13.972781 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.972634 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 
21:33:13.972781 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:13.972644 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/63a47288-d4a7-4ef4-b21b-8744f2a58afe-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:33:14.724185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.724160 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" Apr 24 21:33:14.724616 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.724156 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx" event={"ID":"63a47288-d4a7-4ef4-b21b-8744f2a58afe","Type":"ContainerDied","Data":"511c9cd55779b00d41cd172fcbdd7d1bcd0544b61b4904825d40ec96f3a27630"} Apr 24 21:33:14.724616 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.724292 2581 scope.go:117] "RemoveContainer" containerID="2c9c152cae51f900aaf6d70ec5f72998e25457ccb232c7c124dd8fe559696ca7" Apr 24 21:33:14.733038 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.733022 2581 scope.go:117] "RemoveContainer" containerID="161a654396d412ba7ac2082b5cd4be476754220a5f0d568615e730067cecc3d3" Apr 24 21:33:14.740122 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.740107 2581 scope.go:117] "RemoveContainer" containerID="a1506101aed5376666b25784ddde934617daed3ba67aaa6099fcae5f4974a2a7" Apr 24 21:33:14.745219 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.745199 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx"] Apr 24 21:33:14.748514 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:14.748497 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevg6wx"] Apr 24 21:33:15.873738 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:33:15.873705 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" path="/var/lib/kubelet/pods/63a47288-d4a7-4ef4-b21b-8744f2a58afe/volumes" Apr 24 21:33:27.009505 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.009475 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75"] Apr 24 21:33:27.010033 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.009996 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="main" Apr 24 21:33:27.010033 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010015 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="main" Apr 24 21:33:27.010154 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010047 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="tokenizer" Apr 24 21:33:27.010154 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010056 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="tokenizer" Apr 24 21:33:27.010154 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010079 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="storage-initializer" Apr 24 21:33:27.010154 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010088 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="storage-initializer" Apr 24 21:33:27.010355 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010185 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="main" Apr 24 
21:33:27.010355 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.010198 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="63a47288-d4a7-4ef4-b21b-8744f2a58afe" containerName="tokenizer" Apr 24 21:33:27.013461 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.013441 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.017571 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.017552 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 24 21:33:27.017688 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.017551 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-4v7w5\"" Apr 24 21:33:27.034753 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.034728 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75"] Apr 24 21:33:27.078713 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.078685 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.078864 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.078735 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-uds\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.078864 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.078797 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hbq\" (UniqueName: \"kubernetes.io/projected/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kube-api-access-d9hbq\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.078864 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.078848 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.079010 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.078887 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.079010 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.078934 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180366 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180334 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180518 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180374 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180518 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180435 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180518 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180671 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180514 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hbq\" (UniqueName: \"kubernetes.io/projected/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kube-api-access-d9hbq\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180671 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180568 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180824 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180804 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180852 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180870 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.180954 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.180897 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.183019 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.182989 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.188965 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.188945 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hbq\" (UniqueName: 
\"kubernetes.io/projected/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kube-api-access-d9hbq\") pod \"custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.324161 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.324086 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:27.459390 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.459362 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75"] Apr 24 21:33:27.462574 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:33:27.462547 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52a4f4c_1241_46cd_92c5_45b25cb5d861.slice/crio-f88c52b6c7bc9b248f3c3e4a3ece0dc1dfb5d9513effc28897933d6bdc5a403a WatchSource:0}: Error finding container f88c52b6c7bc9b248f3c3e4a3ece0dc1dfb5d9513effc28897933d6bdc5a403a: Status 404 returned error can't find the container with id f88c52b6c7bc9b248f3c3e4a3ece0dc1dfb5d9513effc28897933d6bdc5a403a Apr 24 21:33:27.773001 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.772964 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerStarted","Data":"0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08"} Apr 24 21:33:27.773001 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:27.772998 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" 
event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerStarted","Data":"f88c52b6c7bc9b248f3c3e4a3ece0dc1dfb5d9513effc28897933d6bdc5a403a"} Apr 24 21:33:28.778069 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:28.778033 2581 generic.go:358] "Generic (PLEG): container finished" podID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerID="0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08" exitCode=0 Apr 24 21:33:28.778467 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:28.778118 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerDied","Data":"0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08"} Apr 24 21:33:29.783050 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:29.783016 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerStarted","Data":"e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1"} Apr 24 21:33:29.783050 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:29.783053 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerStarted","Data":"835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa"} Apr 24 21:33:29.783616 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:29.783111 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:29.804563 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:29.804511 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" podStartSLOduration=3.804494374 podStartE2EDuration="3.804494374s" podCreationTimestamp="2026-04-24 21:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:29.802397004 +0000 UTC m=+1050.481233700" watchObservedRunningTime="2026-04-24 21:33:29.804494374 +0000 UTC m=+1050.483331048" Apr 24 21:33:37.324403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:37.324370 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:37.324403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:37.324409 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:37.327008 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:37.326984 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:37.821565 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:37.821536 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:33:59.828250 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:33:59.828172 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:35:12.585800 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:12.585764 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75"] Apr 24 21:35:12.586383 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:35:12.586079 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="main" containerID="cri-o://835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa" gracePeriod=30 Apr 24 21:35:12.586383 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:12.586118 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="tokenizer" containerID="cri-o://e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1" gracePeriod=30 Apr 24 21:35:13.160609 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.160571 2581 generic.go:358] "Generic (PLEG): container finished" podID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerID="835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa" exitCode=0 Apr 24 21:35:13.160786 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.160653 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerDied","Data":"835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa"} Apr 24 21:35:13.748867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.748843 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:35:13.775538 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775512 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-uds\") pod \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " Apr 24 21:35:13.775673 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775601 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-cache\") pod \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " Apr 24 21:35:13.775673 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775630 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-tmp\") pod \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " Apr 24 21:35:13.775673 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775646 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kserve-provision-location\") pod \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " Apr 24 21:35:13.775832 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775693 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9hbq\" (UniqueName: \"kubernetes.io/projected/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kube-api-access-d9hbq\") pod \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\" (UID: 
\"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " Apr 24 21:35:13.775832 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775719 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tls-certs\") pod \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\" (UID: \"e52a4f4c-1241-46cd-92c5-45b25cb5d861\") " Apr 24 21:35:13.775832 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775797 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e52a4f4c-1241-46cd-92c5-45b25cb5d861" (UID: "e52a4f4c-1241-46cd-92c5-45b25cb5d861"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:13.775989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775847 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e52a4f4c-1241-46cd-92c5-45b25cb5d861" (UID: "e52a4f4c-1241-46cd-92c5-45b25cb5d861"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:13.775989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.775935 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e52a4f4c-1241-46cd-92c5-45b25cb5d861" (UID: "e52a4f4c-1241-46cd-92c5-45b25cb5d861"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:13.776096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.776015 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:35:13.776096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.776034 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:35:13.776096 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.776047 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:35:13.776784 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.776749 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e52a4f4c-1241-46cd-92c5-45b25cb5d861" (UID: "e52a4f4c-1241-46cd-92c5-45b25cb5d861"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:13.777967 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.777946 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e52a4f4c-1241-46cd-92c5-45b25cb5d861" (UID: "e52a4f4c-1241-46cd-92c5-45b25cb5d861"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:13.778038 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.778003 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kube-api-access-d9hbq" (OuterVolumeSpecName: "kube-api-access-d9hbq") pod "e52a4f4c-1241-46cd-92c5-45b25cb5d861" (UID: "e52a4f4c-1241-46cd-92c5-45b25cb5d861"). InnerVolumeSpecName "kube-api-access-d9hbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:13.876434 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.876353 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:35:13.876434 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.876377 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d9hbq\" (UniqueName: \"kubernetes.io/projected/e52a4f4c-1241-46cd-92c5-45b25cb5d861-kube-api-access-d9hbq\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:35:13.876434 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:13.876389 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e52a4f4c-1241-46cd-92c5-45b25cb5d861-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:35:14.166140 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.166047 2581 generic.go:358] "Generic (PLEG): container finished" podID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerID="e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1" exitCode=0 Apr 24 21:35:14.166140 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.166124 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" Apr 24 21:35:14.166388 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.166122 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerDied","Data":"e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1"} Apr 24 21:35:14.166388 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.166233 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75" event={"ID":"e52a4f4c-1241-46cd-92c5-45b25cb5d861","Type":"ContainerDied","Data":"f88c52b6c7bc9b248f3c3e4a3ece0dc1dfb5d9513effc28897933d6bdc5a403a"} Apr 24 21:35:14.166388 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.166257 2581 scope.go:117] "RemoveContainer" containerID="e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1" Apr 24 21:35:14.175168 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.175148 2581 scope.go:117] "RemoveContainer" containerID="835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa" Apr 24 21:35:14.185470 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.183872 2581 scope.go:117] "RemoveContainer" containerID="0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08" Apr 24 21:35:14.193209 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.193183 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75"] Apr 24 21:35:14.194798 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.194774 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-84dd4b7fd7h75"] Apr 24 21:35:14.195597 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.195571 2581 
scope.go:117] "RemoveContainer" containerID="e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1" Apr 24 21:35:14.195847 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:35:14.195829 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1\": container with ID starting with e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1 not found: ID does not exist" containerID="e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1" Apr 24 21:35:14.195902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.195855 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1"} err="failed to get container status \"e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1\": rpc error: code = NotFound desc = could not find container \"e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1\": container with ID starting with e76532cb24ff36eec3aac3207e565b18fdf1f22ddd9e672d7df07267b60ca9d1 not found: ID does not exist" Apr 24 21:35:14.195902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.195873 2581 scope.go:117] "RemoveContainer" containerID="835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa" Apr 24 21:35:14.196137 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:35:14.196117 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa\": container with ID starting with 835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa not found: ID does not exist" containerID="835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa" Apr 24 21:35:14.196183 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.196145 2581 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa"} err="failed to get container status \"835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa\": rpc error: code = NotFound desc = could not find container \"835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa\": container with ID starting with 835d878a5594a2a8f8d26db6d557498c12c1a6a0439307ebf6a832e6cacbb7fa not found: ID does not exist" Apr 24 21:35:14.196183 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.196162 2581 scope.go:117] "RemoveContainer" containerID="0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08" Apr 24 21:35:14.196337 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:35:14.196322 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08\": container with ID starting with 0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08 not found: ID does not exist" containerID="0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08" Apr 24 21:35:14.196391 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:14.196340 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08"} err="failed to get container status \"0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08\": rpc error: code = NotFound desc = could not find container \"0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08\": container with ID starting with 0f12f37af5745c9cef434122d33e82358f177150bbccdf3323c60d42b04bdc08 not found: ID does not exist" Apr 24 21:35:15.874978 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:15.874944 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" 
path="/var/lib/kubelet/pods/e52a4f4c-1241-46cd-92c5-45b25cb5d861/volumes" Apr 24 21:35:17.866193 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866156 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"] Apr 24 21:35:17.866628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866536 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="main" Apr 24 21:35:17.866628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866548 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="main" Apr 24 21:35:17.866628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866569 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="tokenizer" Apr 24 21:35:17.866628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866574 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="tokenizer" Apr 24 21:35:17.866628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866583 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="storage-initializer" Apr 24 21:35:17.866628 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866589 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="storage-initializer" Apr 24 21:35:17.866885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866651 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="tokenizer" Apr 24 21:35:17.866885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.866662 2581 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e52a4f4c-1241-46cd-92c5-45b25cb5d861" containerName="main" Apr 24 21:35:17.871803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.871776 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:17.874103 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.874045 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-6ls6r\"" Apr 24 21:35:17.874252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.874138 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 24 21:35:17.879162 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.879138 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"] Apr 24 21:35:17.903684 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.903656 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txdkk\" (UniqueName: \"kubernetes.io/projected/77ee4087-b150-4fbe-837a-e7c99d43f065-kube-api-access-txdkk\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:17.903831 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.903693 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77ee4087-b150-4fbe-837a-e7c99d43f065-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:17.903831 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.903751 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:17.903831 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.903788 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:17.903941 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.903831 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:17.903976 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:17.903945 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: 
\"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.004866 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.004834 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.004866 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.004871 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.004891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.004962 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.004999 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txdkk\" (UniqueName: \"kubernetes.io/projected/77ee4087-b150-4fbe-837a-e7c99d43f065-kube-api-access-txdkk\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.005030 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77ee4087-b150-4fbe-837a-e7c99d43f065-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005292 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.005193 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005292 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.005282 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005386 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.005334 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.005386 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.005363 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.007516 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.007495 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77ee4087-b150-4fbe-837a-e7c99d43f065-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.013827 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.013799 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txdkk\" (UniqueName: \"kubernetes.io/projected/77ee4087-b150-4fbe-837a-e7c99d43f065-kube-api-access-txdkk\") pod \"router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 
21:35:18.182674 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.182639 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:18.312573 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.312455 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"] Apr 24 21:35:18.314868 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:35:18.314837 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ee4087_b150_4fbe_837a_e7c99d43f065.slice/crio-07c6d7a2029653f0058a01f2882659553ec2110b329775a0537024016137a25c WatchSource:0}: Error finding container 07c6d7a2029653f0058a01f2882659553ec2110b329775a0537024016137a25c: Status 404 returned error can't find the container with id 07c6d7a2029653f0058a01f2882659553ec2110b329775a0537024016137a25c Apr 24 21:35:18.316841 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:18.316822 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:35:19.187502 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:19.187469 2581 generic.go:358] "Generic (PLEG): container finished" podID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerID="4908ef1ad528b5204a1735f5172a44984099aa591232a6aed1e8df801d8ba2ba" exitCode=0 Apr 24 21:35:19.187821 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:19.187555 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerDied","Data":"4908ef1ad528b5204a1735f5172a44984099aa591232a6aed1e8df801d8ba2ba"} Apr 24 21:35:19.187821 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:19.187585 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerStarted","Data":"07c6d7a2029653f0058a01f2882659553ec2110b329775a0537024016137a25c"} Apr 24 21:35:20.196489 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:20.196451 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerStarted","Data":"01790edfbb1978e0a5394e6550bb1b58cc6c574b15fb92a9656130de36122af8"} Apr 24 21:35:20.196489 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:20.196491 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerStarted","Data":"274d9aae0165c7fb4670b209f91ba4cb4fee7d0bb7f19bb8aeedac8756b1d8d0"} Apr 24 21:35:20.196916 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:20.196559 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:20.219103 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:20.219059 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" podStartSLOduration=3.219045745 podStartE2EDuration="3.219045745s" podCreationTimestamp="2026-04-24 21:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:20.217700842 +0000 UTC m=+1160.896537516" watchObservedRunningTime="2026-04-24 21:35:20.219045745 +0000 UTC m=+1160.897882417" Apr 24 21:35:28.182786 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:28.182745 2581 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:28.182786 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:28.182792 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:28.185603 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:28.185577 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:28.229327 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:28.229299 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:49.234631 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:49.234602 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:35:59.873593 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:59.873563 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:35:59.874381 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:59.874360 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:35:59.879919 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:59.879897 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:35:59.880515 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:35:59.880495 2581 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:37:03.403159 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:03.403079 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"] Apr 24 21:37:03.405710 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:03.403507 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="main" containerID="cri-o://274d9aae0165c7fb4670b209f91ba4cb4fee7d0bb7f19bb8aeedac8756b1d8d0" gracePeriod=30 Apr 24 21:37:03.405710 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:03.403591 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="tokenizer" containerID="cri-o://01790edfbb1978e0a5394e6550bb1b58cc6c574b15fb92a9656130de36122af8" gracePeriod=30 Apr 24 21:37:03.570506 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:03.570470 2581 generic.go:358] "Generic (PLEG): container finished" podID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerID="274d9aae0165c7fb4670b209f91ba4cb4fee7d0bb7f19bb8aeedac8756b1d8d0" exitCode=0 Apr 24 21:37:03.570695 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:03.570522 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerDied","Data":"274d9aae0165c7fb4670b209f91ba4cb4fee7d0bb7f19bb8aeedac8756b1d8d0"} Apr 24 21:37:04.577355 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.577322 2581 generic.go:358] "Generic (PLEG): container finished" 
podID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerID="01790edfbb1978e0a5394e6550bb1b58cc6c574b15fb92a9656130de36122af8" exitCode=0 Apr 24 21:37:04.577690 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.577391 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerDied","Data":"01790edfbb1978e0a5394e6550bb1b58cc6c574b15fb92a9656130de36122af8"} Apr 24 21:37:04.659784 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.659761 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" Apr 24 21:37:04.720989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.720910 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77ee4087-b150-4fbe-837a-e7c99d43f065-tls-certs\") pod \"77ee4087-b150-4fbe-837a-e7c99d43f065\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " Apr 24 21:37:04.720989 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.720955 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-kserve-provision-location\") pod \"77ee4087-b150-4fbe-837a-e7c99d43f065\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " Apr 24 21:37:04.721239 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.720989 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-tmp\") pod \"77ee4087-b150-4fbe-837a-e7c99d43f065\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " Apr 24 21:37:04.721239 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721038 2581 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-uds\") pod \"77ee4087-b150-4fbe-837a-e7c99d43f065\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " Apr 24 21:37:04.721239 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721088 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txdkk\" (UniqueName: \"kubernetes.io/projected/77ee4087-b150-4fbe-837a-e7c99d43f065-kube-api-access-txdkk\") pod \"77ee4087-b150-4fbe-837a-e7c99d43f065\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " Apr 24 21:37:04.721239 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721110 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-cache\") pod \"77ee4087-b150-4fbe-837a-e7c99d43f065\" (UID: \"77ee4087-b150-4fbe-837a-e7c99d43f065\") " Apr 24 21:37:04.721458 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721319 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "77ee4087-b150-4fbe-837a-e7c99d43f065" (UID: "77ee4087-b150-4fbe-837a-e7c99d43f065"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:04.721458 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721445 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:37:04.721568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721478 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "77ee4087-b150-4fbe-837a-e7c99d43f065" (UID: "77ee4087-b150-4fbe-837a-e7c99d43f065"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:04.721568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.721479 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "77ee4087-b150-4fbe-837a-e7c99d43f065" (UID: "77ee4087-b150-4fbe-837a-e7c99d43f065"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:04.722134 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.722098 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "77ee4087-b150-4fbe-837a-e7c99d43f065" (UID: "77ee4087-b150-4fbe-837a-e7c99d43f065"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:37:04.723465 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.723415 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ee4087-b150-4fbe-837a-e7c99d43f065-kube-api-access-txdkk" (OuterVolumeSpecName: "kube-api-access-txdkk") pod "77ee4087-b150-4fbe-837a-e7c99d43f065" (UID: "77ee4087-b150-4fbe-837a-e7c99d43f065"). InnerVolumeSpecName "kube-api-access-txdkk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:37:04.723961 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.723938 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ee4087-b150-4fbe-837a-e7c99d43f065-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "77ee4087-b150-4fbe-837a-e7c99d43f065" (UID: "77ee4087-b150-4fbe-837a-e7c99d43f065"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:37:04.822687 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.822649 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-txdkk\" (UniqueName: \"kubernetes.io/projected/77ee4087-b150-4fbe-837a-e7c99d43f065-kube-api-access-txdkk\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:37:04.822687 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.822684 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:37:04.822891 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.822697 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77ee4087-b150-4fbe-837a-e7c99d43f065-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:37:04.822891 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.822711 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:37:04.822891 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:04.822724 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77ee4087-b150-4fbe-837a-e7c99d43f065-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:37:05.582969 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.582939 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"
Apr 24 21:37:05.583475 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.582938 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz" event={"ID":"77ee4087-b150-4fbe-837a-e7c99d43f065","Type":"ContainerDied","Data":"07c6d7a2029653f0058a01f2882659553ec2110b329775a0537024016137a25c"}
Apr 24 21:37:05.583475 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.583057 2581 scope.go:117] "RemoveContainer" containerID="01790edfbb1978e0a5394e6550bb1b58cc6c574b15fb92a9656130de36122af8"
Apr 24 21:37:05.593556 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.593533 2581 scope.go:117] "RemoveContainer" containerID="274d9aae0165c7fb4670b209f91ba4cb4fee7d0bb7f19bb8aeedac8756b1d8d0"
Apr 24 21:37:05.600816 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.600798 2581 scope.go:117] "RemoveContainer" containerID="4908ef1ad528b5204a1735f5172a44984099aa591232a6aed1e8df801d8ba2ba"
Apr 24 21:37:05.608191 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.608167 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"]
Apr 24 21:37:05.612285 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.612258 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8489bffb54-n6hqz"]
Apr 24 21:37:05.873659 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:05.873581 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" path="/var/lib/kubelet/pods/77ee4087-b150-4fbe-837a-e7c99d43f065/volumes"
Apr 24 21:37:19.868274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868241 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"]
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868625 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="tokenizer"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868636 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="tokenizer"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868651 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="main"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868656 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="main"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868671 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="storage-initializer"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868679 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="storage-initializer"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868735 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="main"
Apr 24 21:37:19.868988 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.868744 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="77ee4087-b150-4fbe-837a-e7c99d43f065" containerName="tokenizer"
Apr 24 21:37:19.874140 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.874116 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:19.877222 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.877173 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 24 21:37:19.877222 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.877181 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-pw5lq\""
Apr 24 21:37:19.881914 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.881891 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"]
Apr 24 21:37:19.940598 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.940563 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:19.940745 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.940629 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:19.940745 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.940684 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/244f6c7c-b602-49d1-9ecd-10604ed08606-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:19.940745 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.940711 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvfk\" (UniqueName: \"kubernetes.io/projected/244f6c7c-b602-49d1-9ecd-10604ed08606-kube-api-access-rgvfk\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:19.940906 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.940766 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:19.940906 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:19.940853 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.041961 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.041933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.041979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/244f6c7c-b602-49d1-9ecd-10604ed08606-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.041998 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvfk\" (UniqueName: \"kubernetes.io/projected/244f6c7c-b602-49d1-9ecd-10604ed08606-kube-api-access-rgvfk\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042038 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042063 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042096 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042419 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042381 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042419 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042403 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042545 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042499 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.042545 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.042514 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.044789 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.044745 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/244f6c7c-b602-49d1-9ecd-10604ed08606-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.050091 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.050070 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvfk\" (UniqueName: \"kubernetes.io/projected/244f6c7c-b602-49d1-9ecd-10604ed08606-kube-api-access-rgvfk\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.185912 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.185881 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:20.525323 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.525296 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"]
Apr 24 21:37:20.527610 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:37:20.527582 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244f6c7c_b602_49d1_9ecd_10604ed08606.slice/crio-c8ec38a7a3ae097093869bc84685d52ef246d7ed36fd438314957ff332659924 WatchSource:0}: Error finding container c8ec38a7a3ae097093869bc84685d52ef246d7ed36fd438314957ff332659924: Status 404 returned error can't find the container with id c8ec38a7a3ae097093869bc84685d52ef246d7ed36fd438314957ff332659924
Apr 24 21:37:20.634318 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.634279 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerStarted","Data":"4ce09a281609657a500f384cd3514ec4b444167c33f98529d74151cddaddc1bf"}
Apr 24 21:37:20.634318 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:20.634317 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerStarted","Data":"c8ec38a7a3ae097093869bc84685d52ef246d7ed36fd438314957ff332659924"}
Apr 24 21:37:21.639065 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:21.638975 2581 generic.go:358] "Generic (PLEG): container finished" podID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerID="4ce09a281609657a500f384cd3514ec4b444167c33f98529d74151cddaddc1bf" exitCode=0
Apr 24 21:37:21.639065 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:21.639021 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerDied","Data":"4ce09a281609657a500f384cd3514ec4b444167c33f98529d74151cddaddc1bf"}
Apr 24 21:37:22.644305 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:22.644269 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerStarted","Data":"8fbecc01769f8e28d2f8ade987be6bff09a3a3eba96abd5c568c6172d85420ee"}
Apr 24 21:37:22.644305 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:22.644310 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerStarted","Data":"df747ce7f2762e114ae604723bee403e713b54e66138d5e2ff0c2cf785060860"}
Apr 24 21:37:22.644785 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:22.644485 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:22.667221 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:22.667175 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" podStartSLOduration=3.667161663 podStartE2EDuration="3.667161663s" podCreationTimestamp="2026-04-24 21:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:22.666100416 +0000 UTC m=+1283.344937100" watchObservedRunningTime="2026-04-24 21:37:22.667161663 +0000 UTC m=+1283.345998331"
Apr 24 21:37:30.186439 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:30.186388 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:30.186439 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:30.186449 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:30.189190 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:30.189166 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:30.677767 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:30.677738 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:37:51.681153 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:37:51.681120 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:40:08.149276 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:08.149236 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"]
Apr 24 21:40:08.149873 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:08.149652 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="main" containerID="cri-o://df747ce7f2762e114ae604723bee403e713b54e66138d5e2ff0c2cf785060860" gracePeriod=30
Apr 24 21:40:08.149873 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:08.149689 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="tokenizer" containerID="cri-o://8fbecc01769f8e28d2f8ade987be6bff09a3a3eba96abd5c568c6172d85420ee" gracePeriod=30
Apr 24 21:40:09.222614 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.222578 2581 generic.go:358] "Generic (PLEG): container finished" podID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerID="8fbecc01769f8e28d2f8ade987be6bff09a3a3eba96abd5c568c6172d85420ee" exitCode=0
Apr 24 21:40:09.222614 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.222610 2581 generic.go:358] "Generic (PLEG): container finished" podID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerID="df747ce7f2762e114ae604723bee403e713b54e66138d5e2ff0c2cf785060860" exitCode=0
Apr 24 21:40:09.222987 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.222644 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerDied","Data":"8fbecc01769f8e28d2f8ade987be6bff09a3a3eba96abd5c568c6172d85420ee"}
Apr 24 21:40:09.222987 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.222681 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerDied","Data":"df747ce7f2762e114ae604723bee403e713b54e66138d5e2ff0c2cf785060860"}
Apr 24 21:40:09.327143 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.327119 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:40:09.457502 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457475 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-uds\") pod \"244f6c7c-b602-49d1-9ecd-10604ed08606\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") "
Apr 24 21:40:09.457690 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457511 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/244f6c7c-b602-49d1-9ecd-10604ed08606-tls-certs\") pod \"244f6c7c-b602-49d1-9ecd-10604ed08606\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") "
Apr 24 21:40:09.457690 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457529 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-kserve-provision-location\") pod \"244f6c7c-b602-49d1-9ecd-10604ed08606\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") "
Apr 24 21:40:09.457690 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457555 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-cache\") pod \"244f6c7c-b602-49d1-9ecd-10604ed08606\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") "
Apr 24 21:40:09.457690 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457579 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvfk\" (UniqueName: \"kubernetes.io/projected/244f6c7c-b602-49d1-9ecd-10604ed08606-kube-api-access-rgvfk\") pod \"244f6c7c-b602-49d1-9ecd-10604ed08606\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") "
Apr 24 21:40:09.457690 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457647 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-tmp\") pod \"244f6c7c-b602-49d1-9ecd-10604ed08606\" (UID: \"244f6c7c-b602-49d1-9ecd-10604ed08606\") "
Apr 24 21:40:09.457975 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457766 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "244f6c7c-b602-49d1-9ecd-10604ed08606" (UID: "244f6c7c-b602-49d1-9ecd-10604ed08606"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:40:09.457975 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.457875 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "244f6c7c-b602-49d1-9ecd-10604ed08606" (UID: "244f6c7c-b602-49d1-9ecd-10604ed08606"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:40:09.458076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.458005 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:40:09.458076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.458029 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:40:09.458076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.458037 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "244f6c7c-b602-49d1-9ecd-10604ed08606" (UID: "244f6c7c-b602-49d1-9ecd-10604ed08606"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:40:09.458280 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.458252 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "244f6c7c-b602-49d1-9ecd-10604ed08606" (UID: "244f6c7c-b602-49d1-9ecd-10604ed08606"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:40:09.459745 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.459717 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244f6c7c-b602-49d1-9ecd-10604ed08606-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "244f6c7c-b602-49d1-9ecd-10604ed08606" (UID: "244f6c7c-b602-49d1-9ecd-10604ed08606"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:40:09.459855 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.459772 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244f6c7c-b602-49d1-9ecd-10604ed08606-kube-api-access-rgvfk" (OuterVolumeSpecName: "kube-api-access-rgvfk") pod "244f6c7c-b602-49d1-9ecd-10604ed08606" (UID: "244f6c7c-b602-49d1-9ecd-10604ed08606"). InnerVolumeSpecName "kube-api-access-rgvfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:40:09.558904 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.558870 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/244f6c7c-b602-49d1-9ecd-10604ed08606-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:40:09.558904 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.558901 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:40:09.559093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.558915 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgvfk\" (UniqueName: \"kubernetes.io/projected/244f6c7c-b602-49d1-9ecd-10604ed08606-kube-api-access-rgvfk\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:40:09.559093 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:09.558928 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/244f6c7c-b602-49d1-9ecd-10604ed08606-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 21:40:10.227883 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.227856 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"
Apr 24 21:40:10.228299 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.227848 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x" event={"ID":"244f6c7c-b602-49d1-9ecd-10604ed08606","Type":"ContainerDied","Data":"c8ec38a7a3ae097093869bc84685d52ef246d7ed36fd438314957ff332659924"}
Apr 24 21:40:10.228299 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.227981 2581 scope.go:117] "RemoveContainer" containerID="8fbecc01769f8e28d2f8ade987be6bff09a3a3eba96abd5c568c6172d85420ee"
Apr 24 21:40:10.236416 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.236377 2581 scope.go:117] "RemoveContainer" containerID="df747ce7f2762e114ae604723bee403e713b54e66138d5e2ff0c2cf785060860"
Apr 24 21:40:10.243666 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.243649 2581 scope.go:117] "RemoveContainer" containerID="4ce09a281609657a500f384cd3514ec4b444167c33f98529d74151cddaddc1bf"
Apr 24 21:40:10.247236 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.247211 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"]
Apr 24 21:40:10.251557 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:10.251533 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schebw84x"]
Apr 24 21:40:11.874273 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:11.874231 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" path="/var/lib/kubelet/pods/244f6c7c-b602-49d1-9ecd-10604ed08606/volumes"
Apr 24 21:40:26.979321 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979279 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"]
Apr 24 21:40:26.979902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979876 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="main"
Apr 24 21:40:26.979902 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979899 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="main"
Apr 24 21:40:26.980105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979917 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="tokenizer"
Apr 24 21:40:26.980105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979926 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="tokenizer"
Apr 24 21:40:26.980105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979951 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="storage-initializer"
Apr 24 21:40:26.980105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.979960 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="storage-initializer"
Apr 24 21:40:26.980105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.980071 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="main"
Apr 24 21:40:26.980105 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.980084 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="244f6c7c-b602-49d1-9ecd-10604ed08606" containerName="tokenizer"
Apr 24 21:40:26.983196 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.983177 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:26.988976 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.988953 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-gtfsm\""
Apr 24 21:40:26.989477 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:26.989461 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 24 21:40:27.001306 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.001279 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"]
Apr 24 21:40:27.108187 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.108147 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.108187 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.108192 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.108397 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.108266 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.108397 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.108285 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.108397 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.108306 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdgk\" (UniqueName: \"kubernetes.io/projected/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kube-api-access-kmdgk\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.108543 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.108448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.209648 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.209610 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.209648 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.209647 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.209867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.209668 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdgk\" (UniqueName: \"kubernetes.io/projected/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kube-api-access-kmdgk\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.209867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.209731 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"
Apr 24 21:40:27.209867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.209759 2581
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.209867 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.209793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.210021 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.210007 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.210109 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.210089 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.210194 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.210172 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.210251 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.210181 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.212192 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.212176 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.221523 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.221490 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdgk\" (UniqueName: \"kubernetes.io/projected/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kube-api-access-kmdgk\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.292636 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.292562 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:27.431005 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.430839 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"] Apr 24 21:40:27.433323 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:40:27.433292 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ba6b20_9402_4048_a45c_d9f1f48a0346.slice/crio-5fb74dc453fc84ed6378e66d4938b66f442b2e358c83d49abc0bcbddede3caab WatchSource:0}: Error finding container 5fb74dc453fc84ed6378e66d4938b66f442b2e358c83d49abc0bcbddede3caab: Status 404 returned error can't find the container with id 5fb74dc453fc84ed6378e66d4938b66f442b2e358c83d49abc0bcbddede3caab Apr 24 21:40:27.435125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:27.435107 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:40:28.293986 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:28.293955 2581 generic.go:358] "Generic (PLEG): container finished" podID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerID="dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5" exitCode=0 Apr 24 21:40:28.294298 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:28.294021 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerDied","Data":"dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5"} Apr 24 21:40:28.294298 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:28.294050 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" 
event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerStarted","Data":"5fb74dc453fc84ed6378e66d4938b66f442b2e358c83d49abc0bcbddede3caab"} Apr 24 21:40:29.299531 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:29.299502 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerStarted","Data":"cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423"} Apr 24 21:40:29.299531 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:29.299537 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerStarted","Data":"0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b"} Apr 24 21:40:29.299937 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:29.299692 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:29.320208 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:29.320149 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" podStartSLOduration=3.320130076 podStartE2EDuration="3.320130076s" podCreationTimestamp="2026-04-24 21:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:29.317858306 +0000 UTC m=+1469.996694978" watchObservedRunningTime="2026-04-24 21:40:29.320130076 +0000 UTC m=+1469.998966749" Apr 24 21:40:37.292919 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:37.292882 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:37.293361 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:37.292932 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:37.299011 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:37.298985 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:37.330234 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:37.330212 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:58.334252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:58.334220 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:40:59.899161 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:59.899135 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:40:59.900393 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:59.900372 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:40:59.904203 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:59.904180 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:40:59.905568 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:40:59.905548 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:45:16.820482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:16.820444 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n"] Apr 24 21:45:16.820991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:16.820851 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="main" containerID="cri-o://62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802" gracePeriod=30 Apr 24 21:45:16.820991 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:16.820915 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="tokenizer" containerID="cri-o://f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b" gracePeriod=30 Apr 24 21:45:17.330041 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:17.330003 2581 generic.go:358] "Generic (PLEG): container finished" podID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerID="62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802" exitCode=0 Apr 24 21:45:17.330219 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:17.330078 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerDied","Data":"62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802"} Apr 24 21:45:17.995524 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:17.995503 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:45:18.041667 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.041642 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kserve-provision-location\") pod \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " Apr 24 21:45:18.041835 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.041689 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-tmp\") pod \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " Apr 24 21:45:18.041835 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.041732 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wls45\" (UniqueName: \"kubernetes.io/projected/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kube-api-access-wls45\") pod \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " Apr 24 21:45:18.041835 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.041762 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-cache\") pod \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " Apr 24 21:45:18.041835 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.041788 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-uds\") pod \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\" (UID: 
\"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " Apr 24 21:45:18.042195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.042167 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" (UID: "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.042300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.042249 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" (UID: "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.042351 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.042319 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" (UID: "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.042678 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.042650 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" (UID: "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:18.043880 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.043860 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kube-api-access-wls45" (OuterVolumeSpecName: "kube-api-access-wls45") pod "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" (UID: "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06"). InnerVolumeSpecName "kube-api-access-wls45". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:18.142638 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.142573 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tls-certs\") pod \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\" (UID: \"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06\") " Apr 24 21:45:18.142742 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.142728 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.142742 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.142739 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.142813 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.142749 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wls45\" (UniqueName: \"kubernetes.io/projected/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-kube-api-access-wls45\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.142813 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.142759 2581 reconciler_common.go:299] "Volume detached for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.142813 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.142768 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.144494 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.144472 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" (UID: "4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:18.243746 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.243720 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:45:18.335452 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.335409 2581 generic.go:358] "Generic (PLEG): container finished" podID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerID="f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b" exitCode=0 Apr 24 21:45:18.335591 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.335495 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" Apr 24 21:45:18.335591 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.335492 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerDied","Data":"f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b"} Apr 24 21:45:18.335665 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.335607 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n" event={"ID":"4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06","Type":"ContainerDied","Data":"14a8e7db35e8f8a750822d1931f6c1523b3dbfbec6b71920b2d3ffdc6242b9c2"} Apr 24 21:45:18.335665 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.335624 2581 scope.go:117] "RemoveContainer" containerID="f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b" Apr 24 21:45:18.345247 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.345230 2581 scope.go:117] "RemoveContainer" containerID="62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802" Apr 24 21:45:18.352509 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.352405 2581 scope.go:117] "RemoveContainer" containerID="a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8" Apr 24 21:45:18.358641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.358616 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n"] Apr 24 21:45:18.359937 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.359922 2581 scope.go:117] "RemoveContainer" containerID="f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b" Apr 24 21:45:18.360163 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:45:18.360147 2581 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b\": container with ID starting with f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b not found: ID does not exist" containerID="f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b" Apr 24 21:45:18.360223 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.360170 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b"} err="failed to get container status \"f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b\": rpc error: code = NotFound desc = could not find container \"f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b\": container with ID starting with f314ed84ac76143fa0a77480c45fa5c66a905892a51bc5801b27b9e8594caf0b not found: ID does not exist" Apr 24 21:45:18.360223 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.360191 2581 scope.go:117] "RemoveContainer" containerID="62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802" Apr 24 21:45:18.360473 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:45:18.360406 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802\": container with ID starting with 62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802 not found: ID does not exist" containerID="62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802" Apr 24 21:45:18.360546 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.360490 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802"} err="failed to get container status 
\"62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802\": rpc error: code = NotFound desc = could not find container \"62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802\": container with ID starting with 62dce807e5617280a73f7b4ddda959134154d8ad2ce3919749d0591551c10802 not found: ID does not exist" Apr 24 21:45:18.360546 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.360511 2581 scope.go:117] "RemoveContainer" containerID="a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8" Apr 24 21:45:18.360812 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:45:18.360794 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8\": container with ID starting with a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8 not found: ID does not exist" containerID="a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8" Apr 24 21:45:18.360880 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.360820 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8"} err="failed to get container status \"a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8\": rpc error: code = NotFound desc = could not find container \"a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8\": container with ID starting with a7f830888127a1bebc6e88437ea0c8fcd41307f2ccf6aded42e72331e64321e8 not found: ID does not exist" Apr 24 21:45:18.365155 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:18.365138 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6f97ccbtkm2n"] Apr 24 21:45:19.873538 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:19.873508 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" path="/var/lib/kubelet/pods/4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06/volumes" Apr 24 21:45:37.111776 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.111740 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv"] Apr 24 21:45:37.112829 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.112796 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="main" Apr 24 21:45:37.112829 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.112829 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="main" Apr 24 21:45:37.113034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.112856 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="storage-initializer" Apr 24 21:45:37.113034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.112866 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="storage-initializer" Apr 24 21:45:37.113034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.112890 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="tokenizer" Apr 24 21:45:37.113034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.112899 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="tokenizer" Apr 24 21:45:37.113034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.113000 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="main" Apr 24 21:45:37.113034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.113014 2581 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="4f0aa52b-f02b-45fe-bd05-ca6bb9ddcf06" containerName="tokenizer" Apr 24 21:45:37.116774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.116751 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.119200 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.119176 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-lk96r\"" Apr 24 21:45:37.119335 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.119203 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 21:45:37.126056 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.126029 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv"] Apr 24 21:45:37.201946 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.201913 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7tm\" (UniqueName: \"kubernetes.io/projected/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kube-api-access-8r7tm\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.202125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.201963 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.202125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.201985 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.202125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.202029 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.202125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.202054 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.202125 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.202089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: 
\"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302524 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302490 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302542 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302586 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302627 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302697 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302665 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302897 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302718 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r7tm\" (UniqueName: \"kubernetes.io/projected/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kube-api-access-8r7tm\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302941 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302893 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.302984 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.302957 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.303076 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.303053 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.303124 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.303071 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.305007 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.304986 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.310219 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.310182 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7tm\" (UniqueName: \"kubernetes.io/projected/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kube-api-access-8r7tm\") pod \"precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" 
Apr 24 21:45:37.428443 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.428390 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:37.556331 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.556257 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv"] Apr 24 21:45:37.559125 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:45:37.559092 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b78f077_5e8d_4a33_a8c2_98d82b4b8728.slice/crio-9907ea4d79f9e39493751a1c9db4bf47c7efb1daad549e054ccee8f70be2103b WatchSource:0}: Error finding container 9907ea4d79f9e39493751a1c9db4bf47c7efb1daad549e054ccee8f70be2103b: Status 404 returned error can't find the container with id 9907ea4d79f9e39493751a1c9db4bf47c7efb1daad549e054ccee8f70be2103b Apr 24 21:45:37.560990 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:37.560972 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:45:38.407709 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:38.407681 2581 generic.go:358] "Generic (PLEG): container finished" podID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerID="b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63" exitCode=0 Apr 24 21:45:38.408069 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:38.407732 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerDied","Data":"b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63"} Apr 24 21:45:38.408069 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:38.407783 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerStarted","Data":"9907ea4d79f9e39493751a1c9db4bf47c7efb1daad549e054ccee8f70be2103b"} Apr 24 21:45:39.415026 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:39.414994 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerStarted","Data":"cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c"} Apr 24 21:45:39.415026 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:39.415030 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerStarted","Data":"126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90"} Apr 24 21:45:39.415480 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:39.415096 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:39.438460 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:39.438388 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" podStartSLOduration=2.438369521 podStartE2EDuration="2.438369521s" podCreationTimestamp="2026-04-24 21:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:39.43472642 +0000 UTC m=+1780.113563091" watchObservedRunningTime="2026-04-24 21:45:39.438369521 +0000 UTC m=+1780.117206195" Apr 24 21:45:47.428894 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:47.428854 2581 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:47.429361 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:47.428907 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:47.430212 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:45:47.430189 2581 logging.go:55] [core] [Channel #1112 SubChannel #1113]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.48:9003", ServerName: "10.132.0.48:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.48:9003: connect: connection refused" Apr 24 21:45:47.431497 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:47.431469 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:47.448394 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:47.448367 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:45:48.429847 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:48.429804 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.48:9003\" within 1s: context deadline exceeded" Apr 24 21:45:57.429355 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:45:57.429326 2581 logging.go:55] [core] [Channel #1120 SubChannel #1121]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.48:9003", ServerName: "10.132.0.48:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.48:9003: connect: connection refused" Apr 24 21:45:58.429487 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:58.429417 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.48:9003\" within 1s: context deadline exceeded" Apr 24 21:45:59.922811 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:59.922781 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:45:59.926435 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:59.926398 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:45:59.928130 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:59.928111 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:45:59.931077 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:45:59.931057 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:46:08.452800 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:08.452770 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:46:09.687736 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:09.687703 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv"] Apr 24 21:46:09.688226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:09.687986 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="main" containerID="cri-o://126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90" gracePeriod=30 Apr 24 21:46:09.688226 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:09.688022 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="tokenizer" containerID="cri-o://cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c" gracePeriod=30 Apr 24 21:46:10.533034 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:10.533003 2581 generic.go:358] "Generic (PLEG): container finished" podID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerID="126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90" exitCode=0 Apr 24 21:46:10.533205 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:10.533077 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerDied","Data":"126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90"} Apr 24 21:46:11.159168 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.159144 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:46:11.201087 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201062 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r7tm\" (UniqueName: \"kubernetes.io/projected/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kube-api-access-8r7tm\") pod \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " Apr 24 21:46:11.201259 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201092 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-tmp\") pod \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " Apr 24 21:46:11.201259 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201117 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tls-certs\") pod \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " Apr 24 21:46:11.201259 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201159 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-cache\") pod \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " Apr 24 21:46:11.201259 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201211 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-uds\") pod \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " Apr 24 21:46:11.201514 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201261 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kserve-provision-location\") pod \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\" (UID: \"1b78f077-5e8d-4a33-a8c2-98d82b4b8728\") " Apr 24 21:46:11.201514 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201495 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "1b78f077-5e8d-4a33-a8c2-98d82b4b8728" (UID: "1b78f077-5e8d-4a33-a8c2-98d82b4b8728"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:11.201784 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201760 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "1b78f077-5e8d-4a33-a8c2-98d82b4b8728" (UID: "1b78f077-5e8d-4a33-a8c2-98d82b4b8728"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:11.201885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.201792 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "1b78f077-5e8d-4a33-a8c2-98d82b4b8728" (UID: "1b78f077-5e8d-4a33-a8c2-98d82b4b8728"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:11.202294 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.202259 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b78f077-5e8d-4a33-a8c2-98d82b4b8728" (UID: "1b78f077-5e8d-4a33-a8c2-98d82b4b8728"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:11.203464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.203433 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1b78f077-5e8d-4a33-a8c2-98d82b4b8728" (UID: "1b78f077-5e8d-4a33-a8c2-98d82b4b8728"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:11.203763 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.203745 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kube-api-access-8r7tm" (OuterVolumeSpecName: "kube-api-access-8r7tm") pod "1b78f077-5e8d-4a33-a8c2-98d82b4b8728" (UID: "1b78f077-5e8d-4a33-a8c2-98d82b4b8728"). InnerVolumeSpecName "kube-api-access-8r7tm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:11.302410 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.302381 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:46:11.302410 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.302406 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:46:11.302410 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.302417 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:46:11.302642 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.302445 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8r7tm\" (UniqueName: \"kubernetes.io/projected/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-kube-api-access-8r7tm\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:46:11.302642 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.302454 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:46:11.302642 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.302463 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b78f077-5e8d-4a33-a8c2-98d82b4b8728-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:46:11.537940 ip-10-0-139-15 kubenswrapper[2581]: 
I0424 21:46:11.537857 2581 generic.go:358] "Generic (PLEG): container finished" podID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerID="cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c" exitCode=0 Apr 24 21:46:11.537940 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.537933 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" Apr 24 21:46:11.538136 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.537936 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerDied","Data":"cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c"} Apr 24 21:46:11.538136 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.537977 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv" event={"ID":"1b78f077-5e8d-4a33-a8c2-98d82b4b8728","Type":"ContainerDied","Data":"9907ea4d79f9e39493751a1c9db4bf47c7efb1daad549e054ccee8f70be2103b"} Apr 24 21:46:11.538136 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.537993 2581 scope.go:117] "RemoveContainer" containerID="cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c" Apr 24 21:46:11.547595 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.547574 2581 scope.go:117] "RemoveContainer" containerID="126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90" Apr 24 21:46:11.555060 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.555039 2581 scope.go:117] "RemoveContainer" containerID="b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63" Apr 24 21:46:11.560573 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.560552 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv"] Apr 24 21:46:11.562482 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.562465 2581 scope.go:117] "RemoveContainer" containerID="cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c" Apr 24 21:46:11.562772 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:46:11.562752 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c\": container with ID starting with cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c not found: ID does not exist" containerID="cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c" Apr 24 21:46:11.562839 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.562784 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c"} err="failed to get container status \"cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c\": rpc error: code = NotFound desc = could not find container \"cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c\": container with ID starting with cfd6aad0324e2d92f50e3f1a060fdca714c4eee02bbbe6276bed952ab1948d8c not found: ID does not exist" Apr 24 21:46:11.562839 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.562808 2581 scope.go:117] "RemoveContainer" containerID="126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90" Apr 24 21:46:11.563096 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:46:11.563066 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90\": container with ID starting with 126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90 not found: ID does not exist" 
containerID="126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90" Apr 24 21:46:11.563241 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.563110 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90"} err="failed to get container status \"126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90\": rpc error: code = NotFound desc = could not find container \"126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90\": container with ID starting with 126e2b780f6aaceda963339fbabab2719927d20cba9cc13ce1a3823e93e13f90 not found: ID does not exist" Apr 24 21:46:11.563373 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.563356 2581 scope.go:117] "RemoveContainer" containerID="b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63" Apr 24 21:46:11.563849 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:46:11.563822 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63\": container with ID starting with b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63 not found: ID does not exist" containerID="b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63" Apr 24 21:46:11.563933 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.563856 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63"} err="failed to get container status \"b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63\": rpc error: code = NotFound desc = could not find container \"b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63\": container with ID starting with b31ea85874f925bcd3062d38562300be25a5a7af8cfe642ef191234c2c548d63 not found: ID does not exist" Apr 24 
21:46:11.565228 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.565207 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-cfddc87b7lzvv"] Apr 24 21:46:11.874464 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:11.874363 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" path="/var/lib/kubelet/pods/1b78f077-5e8d-4a33-a8c2-98d82b4b8728/volumes" Apr 24 21:46:22.400711 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.400673 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd"] Apr 24 21:46:22.401537 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401514 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="main" Apr 24 21:46:22.401537 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401537 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="main" Apr 24 21:46:22.401700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401551 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="tokenizer" Apr 24 21:46:22.401700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401558 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="tokenizer" Apr 24 21:46:22.401700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401576 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="storage-initializer" Apr 24 21:46:22.401700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401582 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" 
containerName="storage-initializer" Apr 24 21:46:22.401700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401669 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="tokenizer" Apr 24 21:46:22.401700 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.401686 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b78f077-5e8d-4a33-a8c2-98d82b4b8728" containerName="main" Apr 24 21:46:22.406309 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.406289 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.408701 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.408667 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-z445r\"" Apr 24 21:46:22.408829 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.408775 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:46:22.415238 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.415209 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd"] Apr 24 21:46:22.491327 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.491295 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.491507 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.491345 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.491507 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.491480 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.491600 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.491513 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bk6c\" (UniqueName: \"kubernetes.io/projected/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kube-api-access-5bk6c\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.491600 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.491533 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.491600 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.491562 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.592741 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.592703 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.592923 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.592758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bk6c\" (UniqueName: \"kubernetes.io/projected/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kube-api-access-5bk6c\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.592923 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.592794 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.592923 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.592827 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.592923 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.592866 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.593164 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.593013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.593164 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.593117 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.593274 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.593242 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.593331 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.593283 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.593331 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.593307 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.595701 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.595678 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.600876 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.600855 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bk6c\" (UniqueName: \"kubernetes.io/projected/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kube-api-access-5bk6c\") pod 
\"stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.718046 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.718008 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:22.851872 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:22.851846 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd"] Apr 24 21:46:22.853903 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:46:22.853869 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd976b1e2_4421_4187_bd4f_0f713f08c1b9.slice/crio-35cc38c1f30835f1923bd445e9a328ff0571bbc63c4307f47c0d0db55a84f18c WatchSource:0}: Error finding container 35cc38c1f30835f1923bd445e9a328ff0571bbc63c4307f47c0d0db55a84f18c: Status 404 returned error can't find the container with id 35cc38c1f30835f1923bd445e9a328ff0571bbc63c4307f47c0d0db55a84f18c Apr 24 21:46:23.585621 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:23.585581 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerStarted","Data":"fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8"} Apr 24 21:46:23.585621 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:23.585620 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerStarted","Data":"35cc38c1f30835f1923bd445e9a328ff0571bbc63c4307f47c0d0db55a84f18c"} Apr 24 21:46:24.590968 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:46:24.590933 2581 generic.go:358] "Generic (PLEG): container finished" podID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerID="fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8" exitCode=0 Apr 24 21:46:24.591342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:24.590990 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerDied","Data":"fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8"} Apr 24 21:46:25.599874 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:25.599833 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerStarted","Data":"08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348"} Apr 24 21:46:25.599874 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:25.599881 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerStarted","Data":"6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356"} Apr 24 21:46:25.600308 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:25.600007 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:25.619962 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:25.619920 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" podStartSLOduration=3.619905234 podStartE2EDuration="3.619905234s" podCreationTimestamp="2026-04-24 21:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:25.618239064 +0000 UTC m=+1826.297075727" watchObservedRunningTime="2026-04-24 21:46:25.619905234 +0000 UTC m=+1826.298741906" Apr 24 21:46:32.718403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:32.718367 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:32.718403 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:32.718409 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:32.720955 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:32.720929 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:33.633585 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:33.633552 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:46:54.636537 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:46:54.636506 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:49:23.957325 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:23.957290 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd"] Apr 24 21:49:23.959935 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:23.957713 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="main" 
containerID="cri-o://6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356" gracePeriod=30 Apr 24 21:49:23.959935 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:23.957761 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="tokenizer" containerID="cri-o://08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348" gracePeriod=30 Apr 24 21:49:24.227581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:24.227491 2581 generic.go:358] "Generic (PLEG): container finished" podID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerID="6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356" exitCode=0 Apr 24 21:49:24.227581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:24.227550 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerDied","Data":"6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356"} Apr 24 21:49:24.633506 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:49:24.633398 2581 logging.go:55] [core] [Channel #1350 SubChannel #1351]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.49:9003", ServerName: "10.132.0.49:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.49:9003: connect: connection refused" Apr 24 21:49:25.216998 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.216975 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:49:25.232652 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.232621 2581 generic.go:358] "Generic (PLEG): container finished" podID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerID="08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348" exitCode=0 Apr 24 21:49:25.232779 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.232691 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" Apr 24 21:49:25.232779 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.232696 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerDied","Data":"08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348"} Apr 24 21:49:25.232779 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.232732 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" event={"ID":"d976b1e2-4421-4187-bd4f-0f713f08c1b9","Type":"ContainerDied","Data":"35cc38c1f30835f1923bd445e9a328ff0571bbc63c4307f47c0d0db55a84f18c"} Apr 24 21:49:25.232779 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.232748 2581 scope.go:117] "RemoveContainer" containerID="08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348" Apr 24 21:49:25.241406 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.241252 2581 scope.go:117] "RemoveContainer" containerID="6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356" Apr 24 21:49:25.251179 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.251151 2581 scope.go:117] "RemoveContainer" containerID="fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8" Apr 24 21:49:25.259363 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:49:25.259344 2581 scope.go:117] "RemoveContainer" containerID="08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348" Apr 24 21:49:25.259703 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:49:25.259649 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348\": container with ID starting with 08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348 not found: ID does not exist" containerID="08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348" Apr 24 21:49:25.259703 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.259687 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348"} err="failed to get container status \"08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348\": rpc error: code = NotFound desc = could not find container \"08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348\": container with ID starting with 08ccbf697af8acf9f4d6e9cc60d958ffd3b468820b1f5c9d95318fc69e9a7348 not found: ID does not exist" Apr 24 21:49:25.259833 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.259712 2581 scope.go:117] "RemoveContainer" containerID="6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356" Apr 24 21:49:25.259970 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:49:25.259940 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356\": container with ID starting with 6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356 not found: ID does not exist" containerID="6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356" Apr 24 21:49:25.260007 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:49:25.259981 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356"} err="failed to get container status \"6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356\": rpc error: code = NotFound desc = could not find container \"6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356\": container with ID starting with 6510e80e9f132930e1f55234fc78d94df73c7aa72db63a9e447e23e99503f356 not found: ID does not exist" Apr 24 21:49:25.260045 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.260003 2581 scope.go:117] "RemoveContainer" containerID="fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8" Apr 24 21:49:25.260262 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:49:25.260239 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8\": container with ID starting with fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8 not found: ID does not exist" containerID="fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8" Apr 24 21:49:25.260312 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.260272 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8"} err="failed to get container status \"fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8\": rpc error: code = NotFound desc = could not find container \"fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8\": container with ID starting with fd3b151622ced259d42e85342ef4581b0dfe0876587d8733761943b40ce1e6e8 not found: ID does not exist" Apr 24 21:49:25.345984 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.345949 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-5bk6c\" (UniqueName: \"kubernetes.io/projected/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kube-api-access-5bk6c\") pod \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " Apr 24 21:49:25.346170 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346002 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tls-certs\") pod \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " Apr 24 21:49:25.346170 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346056 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kserve-provision-location\") pod \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " Apr 24 21:49:25.346170 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346137 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-uds\") pod \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " Apr 24 21:49:25.346349 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346194 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-cache\") pod \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " Apr 24 21:49:25.346349 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346236 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-tmp\") pod \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\" (UID: \"d976b1e2-4421-4187-bd4f-0f713f08c1b9\") " Apr 24 21:49:25.346484 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346407 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d976b1e2-4421-4187-bd4f-0f713f08c1b9" (UID: "d976b1e2-4421-4187-bd4f-0f713f08c1b9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:25.346641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346596 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:49:25.346641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346580 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d976b1e2-4421-4187-bd4f-0f713f08c1b9" (UID: "d976b1e2-4421-4187-bd4f-0f713f08c1b9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:25.346824 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346727 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d976b1e2-4421-4187-bd4f-0f713f08c1b9" (UID: "d976b1e2-4421-4187-bd4f-0f713f08c1b9"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:25.346936 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.346913 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d976b1e2-4421-4187-bd4f-0f713f08c1b9" (UID: "d976b1e2-4421-4187-bd4f-0f713f08c1b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:25.348641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.348621 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kube-api-access-5bk6c" (OuterVolumeSpecName: "kube-api-access-5bk6c") pod "d976b1e2-4421-4187-bd4f-0f713f08c1b9" (UID: "d976b1e2-4421-4187-bd4f-0f713f08c1b9"). InnerVolumeSpecName "kube-api-access-5bk6c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:25.348641 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.348626 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d976b1e2-4421-4187-bd4f-0f713f08c1b9" (UID: "d976b1e2-4421-4187-bd4f-0f713f08c1b9"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:25.447560 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.447529 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bk6c\" (UniqueName: \"kubernetes.io/projected/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kube-api-access-5bk6c\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:49:25.447560 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.447553 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:49:25.447560 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.447563 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:49:25.447803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.447574 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:49:25.447803 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.447583 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d976b1e2-4421-4187-bd4f-0f713f08c1b9-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:49:25.554717 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.554684 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd"] Apr 24 21:49:25.558837 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.558815 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd"] Apr 24 21:49:25.633810 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.633776 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-djbwd" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.49:9003\" within 1s: context deadline exceeded" Apr 24 21:49:25.633958 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:49:25.633860 2581 logging.go:55] [core] [Channel #1350 SubChannel #1351]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.49:9003", ServerName: "10.132.0.49:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.49:9003: operation was canceled" Apr 24 21:49:25.873773 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:25.873692 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" path="/var/lib/kubelet/pods/d976b1e2-4421-4187-bd4f-0f713f08c1b9/volumes" Apr 24 21:49:40.183966 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.183928 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x"] Apr 24 21:49:40.184352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184305 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="main" Apr 24 21:49:40.184352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184317 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="main" Apr 24 21:49:40.184352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184328 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" 
containerName="storage-initializer" Apr 24 21:49:40.184352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184334 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="storage-initializer" Apr 24 21:49:40.184352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184342 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="tokenizer" Apr 24 21:49:40.184352 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184348 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="tokenizer" Apr 24 21:49:40.184606 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184411 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="tokenizer" Apr 24 21:49:40.184606 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.184446 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d976b1e2-4421-4187-bd4f-0f713f08c1b9" containerName="main" Apr 24 21:49:40.187862 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.187840 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.190299 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.190243 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:49:40.190450 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.190305 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-jq95b\"" Apr 24 21:49:40.199642 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.199618 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x"] Apr 24 21:49:40.272980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.272942 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.272980 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.272982 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.273195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.273011 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.273195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.273045 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kw25\" (UniqueName: \"kubernetes.io/projected/bc473620-f287-403d-afca-4608bce01fe4-kube-api-access-6kw25\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.273195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.273071 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.273195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.273183 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc473620-f287-403d-afca-4608bce01fe4-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374177 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374144 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bc473620-f287-403d-afca-4608bce01fe4-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374229 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374255 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kw25\" (UniqueName: 
\"kubernetes.io/projected/bc473620-f287-403d-afca-4608bce01fe4-kube-api-access-6kw25\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374342 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374326 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374752 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374731 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374830 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374830 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374768 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.374916 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.374827 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.376932 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.376906 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc473620-f287-403d-afca-4608bce01fe4-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.384165 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.384137 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kw25\" (UniqueName: \"kubernetes.io/projected/bc473620-f287-403d-afca-4608bce01fe4-kube-api-access-6kw25\") pod \"stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.499409 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.499337 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:40.633634 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:40.633507 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x"] Apr 24 21:49:40.636258 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:49:40.636230 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc473620_f287_403d_afca_4608bce01fe4.slice/crio-b4308f581e5008f18d1fe71e064595e2aff92d749aa889b3ad8be4cdc6ed472d WatchSource:0}: Error finding container b4308f581e5008f18d1fe71e064595e2aff92d749aa889b3ad8be4cdc6ed472d: Status 404 returned error can't find the container with id b4308f581e5008f18d1fe71e064595e2aff92d749aa889b3ad8be4cdc6ed472d Apr 24 21:49:41.296821 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:41.296784 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerStarted","Data":"2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429"} Apr 24 21:49:41.296821 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:41.296828 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerStarted","Data":"b4308f581e5008f18d1fe71e064595e2aff92d749aa889b3ad8be4cdc6ed472d"} Apr 24 21:49:42.301484 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:42.301445 2581 generic.go:358] "Generic (PLEG): container finished" podID="bc473620-f287-403d-afca-4608bce01fe4" containerID="2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429" exitCode=0 Apr 24 21:49:42.301847 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:42.301512 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerDied","Data":"2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429"} Apr 24 21:49:43.307366 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:43.307331 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerStarted","Data":"bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4"} Apr 24 21:49:43.307366 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:43.307365 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerStarted","Data":"b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71"} Apr 24 21:49:43.307790 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:43.307471 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:43.328581 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:43.328536 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" podStartSLOduration=3.328519951 podStartE2EDuration="3.328519951s" podCreationTimestamp="2026-04-24 21:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:43.326846904 +0000 UTC m=+2024.005683579" watchObservedRunningTime="2026-04-24 21:49:43.328519951 +0000 UTC m=+2024.007356623" Apr 24 21:49:50.499820 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:50.499779 2581 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:50.499820 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:50.499829 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:50.502743 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:50.502716 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:49:51.337017 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:49:51.336989 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:50:12.340505 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:50:12.340476 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:50:59.948035 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:50:59.948004 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:50:59.952394 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:50:59.952371 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:50:59.953915 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:50:59.953899 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:50:59.957863 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:50:59.957847 2581 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:51:31.491144 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:31.491112 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x"] Apr 24 21:51:31.491759 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:31.491533 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="main" containerID="cri-o://b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71" gracePeriod=30 Apr 24 21:51:31.491759 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:31.491529 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="tokenizer" containerID="cri-o://bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4" gracePeriod=30 Apr 24 21:51:31.687157 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:31.687120 2581 generic.go:358] "Generic (PLEG): container finished" podID="bc473620-f287-403d-afca-4608bce01fe4" containerID="b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71" exitCode=0 Apr 24 21:51:31.687323 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:31.687204 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerDied","Data":"b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71"} Apr 24 21:51:32.339452 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:51:32.339389 2581 logging.go:55] [core] [Channel #1487 SubChannel #1488]grpc: addrConn.createTransport failed to 
connect to {Addr: "10.132.0.50:9003", ServerName: "10.132.0.50:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.50:9003: connect: connection refused" Apr 24 21:51:32.654022 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.653997 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:51:32.692783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.692749 2581 generic.go:358] "Generic (PLEG): container finished" podID="bc473620-f287-403d-afca-4608bce01fe4" containerID="bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4" exitCode=0 Apr 24 21:51:32.692938 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.692786 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerDied","Data":"bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4"} Apr 24 21:51:32.692938 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.692814 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" event={"ID":"bc473620-f287-403d-afca-4608bce01fe4","Type":"ContainerDied","Data":"b4308f581e5008f18d1fe71e064595e2aff92d749aa889b3ad8be4cdc6ed472d"} Apr 24 21:51:32.692938 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.692826 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" Apr 24 21:51:32.692938 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.692830 2581 scope.go:117] "RemoveContainer" containerID="bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4" Apr 24 21:51:32.703531 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.703510 2581 scope.go:117] "RemoveContainer" containerID="b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71" Apr 24 21:51:32.711185 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.711167 2581 scope.go:117] "RemoveContainer" containerID="2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429" Apr 24 21:51:32.718331 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.718315 2581 scope.go:117] "RemoveContainer" containerID="bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4" Apr 24 21:51:32.718548 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:51:32.718531 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4\": container with ID starting with bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4 not found: ID does not exist" containerID="bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4" Apr 24 21:51:32.718602 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.718556 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4"} err="failed to get container status \"bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4\": rpc error: code = NotFound desc = could not find container \"bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4\": container with ID starting with bb8ceeae4caca5adf834a9ca086f6742a2b2d580d6813062210403ace940f5d4 not found: ID does not exist" Apr 24 
21:51:32.718602 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.718581 2581 scope.go:117] "RemoveContainer" containerID="b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71" Apr 24 21:51:32.718828 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:51:32.718809 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71\": container with ID starting with b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71 not found: ID does not exist" containerID="b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71" Apr 24 21:51:32.718873 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.718835 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71"} err="failed to get container status \"b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71\": rpc error: code = NotFound desc = could not find container \"b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71\": container with ID starting with b7c3a6ef41a3a82e42a47e63f9c6afe0ab9bb0bfed925cc36f94dce77cd51d71 not found: ID does not exist" Apr 24 21:51:32.718873 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.718851 2581 scope.go:117] "RemoveContainer" containerID="2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429" Apr 24 21:51:32.719051 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:51:32.719035 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429\": container with ID starting with 2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429 not found: ID does not exist" containerID="2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429" Apr 24 21:51:32.719096 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.719055 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429"} err="failed to get container status \"2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429\": rpc error: code = NotFound desc = could not find container \"2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429\": container with ID starting with 2ad8a92f0c0dd124c51f08aa641003668e9b3edeb391b8feee05ce5697901429 not found: ID does not exist" Apr 24 21:51:32.817674 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817646 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-cache\") pod \"bc473620-f287-403d-afca-4608bce01fe4\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " Apr 24 21:51:32.817812 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817681 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-kserve-provision-location\") pod \"bc473620-f287-403d-afca-4608bce01fe4\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " Apr 24 21:51:32.817812 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817737 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-uds\") pod \"bc473620-f287-403d-afca-4608bce01fe4\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " Apr 24 21:51:32.817812 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817754 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc473620-f287-403d-afca-4608bce01fe4-tls-certs\") pod 
\"bc473620-f287-403d-afca-4608bce01fe4\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " Apr 24 21:51:32.817812 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817779 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kw25\" (UniqueName: \"kubernetes.io/projected/bc473620-f287-403d-afca-4608bce01fe4-kube-api-access-6kw25\") pod \"bc473620-f287-403d-afca-4608bce01fe4\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " Apr 24 21:51:32.817812 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817803 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-tmp\") pod \"bc473620-f287-403d-afca-4608bce01fe4\" (UID: \"bc473620-f287-403d-afca-4608bce01fe4\") " Apr 24 21:51:32.818071 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817959 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "bc473620-f287-403d-afca-4608bce01fe4" (UID: "bc473620-f287-403d-afca-4608bce01fe4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:51:32.818071 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.817977 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "bc473620-f287-403d-afca-4608bce01fe4" (UID: "bc473620-f287-403d-afca-4608bce01fe4"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:51:32.818071 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.818060 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:51:32.818222 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.818073 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:51:32.818301 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.818272 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "bc473620-f287-403d-afca-4608bce01fe4" (UID: "bc473620-f287-403d-afca-4608bce01fe4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:51:32.818576 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.818549 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc473620-f287-403d-afca-4608bce01fe4" (UID: "bc473620-f287-403d-afca-4608bce01fe4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:51:32.819816 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.819795 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc473620-f287-403d-afca-4608bce01fe4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bc473620-f287-403d-afca-4608bce01fe4" (UID: "bc473620-f287-403d-afca-4608bce01fe4"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:51:32.819909 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.819885 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc473620-f287-403d-afca-4608bce01fe4-kube-api-access-6kw25" (OuterVolumeSpecName: "kube-api-access-6kw25") pod "bc473620-f287-403d-afca-4608bce01fe4" (UID: "bc473620-f287-403d-afca-4608bce01fe4"). InnerVolumeSpecName "kube-api-access-6kw25". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:51:32.918871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.918797 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc473620-f287-403d-afca-4608bce01fe4-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:51:32.918871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.918826 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6kw25\" (UniqueName: \"kubernetes.io/projected/bc473620-f287-403d-afca-4608bce01fe4-kube-api-access-6kw25\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:51:32.918871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.918840 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:51:32.918871 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:32.918853 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc473620-f287-403d-afca-4608bce01fe4-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:51:33.015962 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:33.015923 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x"] Apr 24 21:51:33.020001 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:33.019978 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x"] Apr 24 21:51:33.339339 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:33.339296 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5c6896cff6-vqv5x" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.50:9003\" within 1s: context deadline exceeded" Apr 24 21:51:33.874884 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:51:33.874848 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc473620-f287-403d-afca-4608bce01fe4" path="/var/lib/kubelet/pods/bc473620-f287-403d-afca-4608bce01fe4/volumes" Apr 24 21:52:09.471886 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.471845 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6c9547c57-hr9ff"] Apr 24 21:52:09.472491 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.472177 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" podUID="81a8ae8b-f98a-4240-9832-35069d2f1c6e" containerName="manager" containerID="cri-o://1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52" gracePeriod=30 Apr 24 21:52:09.720153 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.720131 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:52:09.797908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.797839 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert\") pod \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " Apr 24 21:52:09.797908 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.797881 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dth2c\" (UniqueName: \"kubernetes.io/projected/81a8ae8b-f98a-4240-9832-35069d2f1c6e-kube-api-access-dth2c\") pod \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\" (UID: \"81a8ae8b-f98a-4240-9832-35069d2f1c6e\") " Apr 24 21:52:09.799940 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.799915 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert" (OuterVolumeSpecName: "cert") pod "81a8ae8b-f98a-4240-9832-35069d2f1c6e" (UID: "81a8ae8b-f98a-4240-9832-35069d2f1c6e"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:52:09.800043 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.799986 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a8ae8b-f98a-4240-9832-35069d2f1c6e-kube-api-access-dth2c" (OuterVolumeSpecName: "kube-api-access-dth2c") pod "81a8ae8b-f98a-4240-9832-35069d2f1c6e" (UID: "81a8ae8b-f98a-4240-9832-35069d2f1c6e"). InnerVolumeSpecName "kube-api-access-dth2c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:52:09.816907 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.816876 2581 generic.go:358] "Generic (PLEG): container finished" podID="81a8ae8b-f98a-4240-9832-35069d2f1c6e" containerID="1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52" exitCode=0 Apr 24 21:52:09.817006 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.816919 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" event={"ID":"81a8ae8b-f98a-4240-9832-35069d2f1c6e","Type":"ContainerDied","Data":"1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52"} Apr 24 21:52:09.817006 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.816940 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" Apr 24 21:52:09.817006 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.816945 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c9547c57-hr9ff" event={"ID":"81a8ae8b-f98a-4240-9832-35069d2f1c6e","Type":"ContainerDied","Data":"c05cc34ab2ca1505ad21f383f829c76d570c65cd849ed08bbfe36cc34d950e82"} Apr 24 21:52:09.817006 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.816966 2581 scope.go:117] "RemoveContainer" containerID="1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52" Apr 24 21:52:09.825668 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.825648 2581 scope.go:117] "RemoveContainer" containerID="1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52" Apr 24 21:52:09.825919 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:52:09.825892 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52\": container with ID starting with 1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52 
not found: ID does not exist" containerID="1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52" Apr 24 21:52:09.825972 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.825919 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52"} err="failed to get container status \"1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52\": rpc error: code = NotFound desc = could not find container \"1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52\": container with ID starting with 1abaa4c7512bf3bd19daf7937df75d85d6565e9885b1c764de4b128b58c73e52 not found: ID does not exist" Apr 24 21:52:09.836492 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.836470 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6c9547c57-hr9ff"] Apr 24 21:52:09.841930 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.841906 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-6c9547c57-hr9ff"] Apr 24 21:52:09.873940 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.873918 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a8ae8b-f98a-4240-9832-35069d2f1c6e" path="/var/lib/kubelet/pods/81a8ae8b-f98a-4240-9832-35069d2f1c6e/volumes" Apr 24 21:52:09.899183 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.899158 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81a8ae8b-f98a-4240-9832-35069d2f1c6e-cert\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:52:09.899183 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:52:09.899184 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dth2c\" (UniqueName: \"kubernetes.io/projected/81a8ae8b-f98a-4240-9832-35069d2f1c6e-kube-api-access-dth2c\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 
21:55:11.414251 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:11.414218 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"] Apr 24 21:55:11.416833 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:11.414556 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="main" containerID="cri-o://0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b" gracePeriod=30 Apr 24 21:55:11.416833 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:11.414627 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="tokenizer" containerID="cri-o://cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423" gracePeriod=30 Apr 24 21:55:12.449781 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.449750 2581 generic.go:358] "Generic (PLEG): container finished" podID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerID="0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b" exitCode=0 Apr 24 21:55:12.450131 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.449821 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerDied","Data":"0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b"} Apr 24 21:55:12.680544 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.680524 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:55:12.784400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784199 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmdgk\" (UniqueName: \"kubernetes.io/projected/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kube-api-access-kmdgk\") pod \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " Apr 24 21:55:12.784400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784267 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-cache\") pod \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " Apr 24 21:55:12.784400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784340 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kserve-provision-location\") pod \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " Apr 24 21:55:12.784400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784367 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-tmp\") pod \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " Apr 24 21:55:12.784400 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784396 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tls-certs\") pod \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " 
Apr 24 21:55:12.784798 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784456 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-uds\") pod \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\" (UID: \"b6ba6b20-9402-4048-a45c-d9f1f48a0346\") " Apr 24 21:55:12.785035 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.784933 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b6ba6b20-9402-4048-a45c-d9f1f48a0346" (UID: "b6ba6b20-9402-4048-a45c-d9f1f48a0346"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:12.786078 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.785805 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b6ba6b20-9402-4048-a45c-d9f1f48a0346" (UID: "b6ba6b20-9402-4048-a45c-d9f1f48a0346"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:12.786885 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.786715 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b6ba6b20-9402-4048-a45c-d9f1f48a0346" (UID: "b6ba6b20-9402-4048-a45c-d9f1f48a0346"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:12.787055 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.787038 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b6ba6b20-9402-4048-a45c-d9f1f48a0346" (UID: "b6ba6b20-9402-4048-a45c-d9f1f48a0346"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:55:12.788715 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.788667 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kube-api-access-kmdgk" (OuterVolumeSpecName: "kube-api-access-kmdgk") pod "b6ba6b20-9402-4048-a45c-d9f1f48a0346" (UID: "b6ba6b20-9402-4048-a45c-d9f1f48a0346"). InnerVolumeSpecName "kube-api-access-kmdgk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:55:12.788804 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.788724 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b6ba6b20-9402-4048-a45c-d9f1f48a0346" (UID: "b6ba6b20-9402-4048-a45c-d9f1f48a0346"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:55:12.885994 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.885955 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmdgk\" (UniqueName: \"kubernetes.io/projected/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kube-api-access-kmdgk\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:55:12.885994 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.885992 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:55:12.886195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.886006 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:55:12.886195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.886019 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:55:12.886195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.886033 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:55:12.886195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:12.886046 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6ba6b20-9402-4048-a45c-d9f1f48a0346-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 21:55:13.454825 ip-10-0-139-15 kubenswrapper[2581]: I0424 
21:55:13.454790 2581 generic.go:358] "Generic (PLEG): container finished" podID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerID="cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423" exitCode=0 Apr 24 21:55:13.455252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.454859 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerDied","Data":"cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423"} Apr 24 21:55:13.455252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.454887 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" event={"ID":"b6ba6b20-9402-4048-a45c-d9f1f48a0346","Type":"ContainerDied","Data":"5fb74dc453fc84ed6378e66d4938b66f442b2e358c83d49abc0bcbddede3caab"} Apr 24 21:55:13.455252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.454892 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4" Apr 24 21:55:13.455252 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.454902 2581 scope.go:117] "RemoveContainer" containerID="cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423" Apr 24 21:55:13.464840 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.464820 2581 scope.go:117] "RemoveContainer" containerID="0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b" Apr 24 21:55:13.474442 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.474383 2581 scope.go:117] "RemoveContainer" containerID="dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5" Apr 24 21:55:13.477205 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.477184 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"] Apr 24 21:55:13.480526 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.480493 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-784bczj4c4"] Apr 24 21:55:13.483328 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.483315 2581 scope.go:117] "RemoveContainer" containerID="cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423" Apr 24 21:55:13.483594 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:55:13.483572 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423\": container with ID starting with cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423 not found: ID does not exist" containerID="cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423" Apr 24 21:55:13.483652 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.483602 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423"} err="failed to get container status \"cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423\": rpc error: code = NotFound desc = could not find container \"cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423\": container with ID starting with cbc62192406fa84a1119e1deb128ec99748caefdd0a82858f6d944cbe5057423 not found: ID does not exist" Apr 24 21:55:13.483652 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.483626 2581 scope.go:117] "RemoveContainer" containerID="0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b" Apr 24 21:55:13.483888 ip-10-0-139-15 kubenswrapper[2581]: E0424 21:55:13.483870 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b\": container with ID starting with 0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b not found: ID does not exist" containerID="0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b" Apr 24 21:55:13.483932 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.483896 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b"} err="failed to get container status \"0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b\": rpc error: code = NotFound desc = could not find container \"0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b\": container with ID starting with 0004b535bc817ce76594e962e5752c5ed56193c6de8d68be87d92c03ff9a376b not found: ID does not exist" Apr 24 21:55:13.483932 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.483911 2581 scope.go:117] "RemoveContainer" containerID="dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5" Apr 24 21:55:13.484126 ip-10-0-139-15 
kubenswrapper[2581]: E0424 21:55:13.484111 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5\": container with ID starting with dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5 not found: ID does not exist" containerID="dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5" Apr 24 21:55:13.484167 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.484129 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5"} err="failed to get container status \"dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5\": rpc error: code = NotFound desc = could not find container \"dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5\": container with ID starting with dc995fa59a334fe19190d93b3e229123395fb7f2100c1787b8a755bb688d00b5 not found: ID does not exist" Apr 24 21:55:13.879480 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:13.879380 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" path="/var/lib/kubelet/pods/b6ba6b20-9402-4048-a45c-d9f1f48a0346/volumes" Apr 24 21:55:36.174172 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174133 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"] Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174514 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="storage-initializer" Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174526 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" 
containerName="storage-initializer" Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174535 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="tokenizer" Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174541 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="tokenizer" Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174553 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81a8ae8b-f98a-4240-9832-35069d2f1c6e" containerName="manager" Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174559 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a8ae8b-f98a-4240-9832-35069d2f1c6e" containerName="manager" Apr 24 21:55:36.174564 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174565 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="main" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174571 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="main" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174577 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="main" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174582 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="main" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174597 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="storage-initializer" Apr 24 
21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174603 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="storage-initializer" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174611 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="tokenizer" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174615 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="tokenizer" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174668 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="main" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174675 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="81a8ae8b-f98a-4240-9832-35069d2f1c6e" containerName="manager" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174682 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="tokenizer" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174689 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6ba6b20-9402-4048-a45c-d9f1f48a0346" containerName="main" Apr 24 21:55:36.174783 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.174697 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc473620-f287-403d-afca-4608bce01fe4" containerName="tokenizer" Apr 24 21:55:36.178627 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.178604 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.180905 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.180809 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-hhfkp\"" Apr 24 21:55:36.180905 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.180837 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 24 21:55:36.180905 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.180857 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:55:36.181843 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.181823 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qdwck\"" Apr 24 21:55:36.181946 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.181827 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:55:36.188180 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.188156 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"] Apr 24 21:55:36.281064 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.281025 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.281064 ip-10-0-139-15 
kubenswrapper[2581]: I0424 21:55:36.281065 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.281300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.281173 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.281300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.281208 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dngl\" (UniqueName: \"kubernetes.io/projected/dcad4a5b-bda3-4020-8a36-88141b649074-kube-api-access-8dngl\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.281300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.281243 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.281300 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.281281 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcad4a5b-bda3-4020-8a36-88141b649074-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.381819 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.381780 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.381847 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dngl\" (UniqueName: \"kubernetes.io/projected/dcad4a5b-bda3-4020-8a36-88141b649074-kube-api-access-8dngl\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.381886 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.381920 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcad4a5b-bda3-4020-8a36-88141b649074-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.381969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382003 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.381995 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382267 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.382216 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382267 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.382227 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382390 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.382297 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.382483 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.382416 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.384359 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.384342 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcad4a5b-bda3-4020-8a36-88141b649074-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 
21:55:36.390919 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.390894 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dngl\" (UniqueName: \"kubernetes.io/projected/dcad4a5b-bda3-4020-8a36-88141b649074-kube-api-access-8dngl\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.490024 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.489947 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:36.621841 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.621816 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"] Apr 24 21:55:36.624576 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:55:36.624544 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcad4a5b_bda3_4020_8a36_88141b649074.slice/crio-9c4ebfca35bba8b7413995fb316f974705e47df794e3ad71cc4a5619b2bb577d WatchSource:0}: Error finding container 9c4ebfca35bba8b7413995fb316f974705e47df794e3ad71cc4a5619b2bb577d: Status 404 returned error can't find the container with id 9c4ebfca35bba8b7413995fb316f974705e47df794e3ad71cc4a5619b2bb577d Apr 24 21:55:36.626546 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:36.626525 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:55:37.548671 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:37.548640 2581 generic.go:358] "Generic (PLEG): container finished" podID="dcad4a5b-bda3-4020-8a36-88141b649074" containerID="1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0" exitCode=0 Apr 24 21:55:37.549028 
ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:37.548723 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerDied","Data":"1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0"} Apr 24 21:55:37.549028 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:37.548757 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerStarted","Data":"9c4ebfca35bba8b7413995fb316f974705e47df794e3ad71cc4a5619b2bb577d"} Apr 24 21:55:38.554262 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:38.554219 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerStarted","Data":"b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448"} Apr 24 21:55:38.554262 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:38.554264 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerStarted","Data":"3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb"} Apr 24 21:55:38.554737 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:38.554296 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:38.575915 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:38.575865 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" 
podStartSLOduration=2.57585187 podStartE2EDuration="2.57585187s" podCreationTimestamp="2026-04-24 21:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:38.574065387 +0000 UTC m=+2379.252902074" watchObservedRunningTime="2026-04-24 21:55:38.57585187 +0000 UTC m=+2379.254688541" Apr 24 21:55:46.490206 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:46.490177 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:46.490774 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:46.490217 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:46.493040 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:46.493016 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:46.589808 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:46.589780 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:55:59.972243 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:59.972207 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:55:59.977225 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:59.977203 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 21:55:59.977649 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:59.977631 2581 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:55:59.982285 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:55:59.982269 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log" Apr 24 21:56:07.593558 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:07.593527 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 21:56:42.051580 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.051543 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"] Apr 24 21:56:42.054195 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.054173 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.058133 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.058109 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-pjfzt\"" Apr 24 21:56:42.058241 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.058200 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 21:56:42.067719 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.067697 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"] Apr 24 21:56:42.121296 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.121259 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc2520c-5e80-4345-bd52-dc4457adbad8-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.121509 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.121347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.121509 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.121380 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.121509 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.121403 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.121509 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.121462 
2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.121658 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.121516 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bj9\" (UniqueName: \"kubernetes.io/projected/fdc2520c-5e80-4345-bd52-dc4457adbad8-kube-api-access-g5bj9\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.222795 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.222764 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.222795 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.222801 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223024 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.222846 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223024 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.222873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223024 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.222893 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bj9\" (UniqueName: \"kubernetes.io/projected/fdc2520c-5e80-4345-bd52-dc4457adbad8-kube-api-access-g5bj9\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223024 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.222962 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc2520c-5e80-4345-bd52-dc4457adbad8-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223243 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.223144 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223243 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.223222 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223587 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.223269 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.223784 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.223757 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.229443 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.226665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fdc2520c-5e80-4345-bd52-dc4457adbad8-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.233396 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.233372 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bj9\" (UniqueName: \"kubernetes.io/projected/fdc2520c-5e80-4345-bd52-dc4457adbad8-kube-api-access-g5bj9\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.364614 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.364528 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:42.490866 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.490830 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"] Apr 24 21:56:42.494623 ip-10-0-139-15 kubenswrapper[2581]: W0424 21:56:42.494598 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc2520c_5e80_4345_bd52_dc4457adbad8.slice/crio-5f42b7bbaecdd0b4f40e9999125ce132e9375e23fbceee524ebda5c27cd7a5eb WatchSource:0}: Error finding container 5f42b7bbaecdd0b4f40e9999125ce132e9375e23fbceee524ebda5c27cd7a5eb: Status 404 returned error can't find the container with id 5f42b7bbaecdd0b4f40e9999125ce132e9375e23fbceee524ebda5c27cd7a5eb Apr 24 21:56:42.779612 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.779568 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerStarted","Data":"7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370"} Apr 24 21:56:42.779612 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:42.779612 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerStarted","Data":"5f42b7bbaecdd0b4f40e9999125ce132e9375e23fbceee524ebda5c27cd7a5eb"} Apr 24 21:56:43.786853 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:43.786696 2581 generic.go:358] "Generic (PLEG): container finished" podID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerID="7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370" exitCode=0 Apr 24 21:56:43.786853 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:43.786788 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerDied","Data":"7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370"} Apr 24 21:56:44.792468 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:44.792401 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerStarted","Data":"2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240"} Apr 24 21:56:44.792861 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:44.792473 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" 
event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerStarted","Data":"f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585"} Apr 24 21:56:44.792861 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:44.792571 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:44.811527 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:44.811475 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" podStartSLOduration=2.811460263 podStartE2EDuration="2.811460263s" podCreationTimestamp="2026-04-24 21:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:44.810546868 +0000 UTC m=+2445.489383561" watchObservedRunningTime="2026-04-24 21:56:44.811460263 +0000 UTC m=+2445.490296936" Apr 24 21:56:52.365439 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:52.365393 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:52.365981 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:52.365459 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:52.368246 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:52.368223 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:56:52.824217 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:56:52.824185 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 21:57:13.828269 ip-10-0-139-15 kubenswrapper[2581]: I0424 21:57:13.828238 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" Apr 24 22:00:24.034814 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:24.034778 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"] Apr 24 22:00:24.035549 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:24.035514 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="main" containerID="cri-o://3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb" gracePeriod=30 Apr 24 22:00:24.035808 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:24.035586 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="tokenizer" containerID="cri-o://b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448" gracePeriod=30 Apr 24 22:00:24.547622 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:24.547585 2581 generic.go:358] "Generic (PLEG): container finished" podID="dcad4a5b-bda3-4020-8a36-88141b649074" containerID="3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb" exitCode=0 Apr 24 22:00:24.547793 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:24.547662 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" 
event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerDied","Data":"3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb"} Apr 24 22:00:25.271718 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.271695 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" Apr 24 22:00:25.371305 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371228 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-tmp\") pod \"dcad4a5b-bda3-4020-8a36-88141b649074\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " Apr 24 22:00:25.371305 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371259 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-kserve-provision-location\") pod \"dcad4a5b-bda3-4020-8a36-88141b649074\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " Apr 24 22:00:25.371305 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371280 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dngl\" (UniqueName: \"kubernetes.io/projected/dcad4a5b-bda3-4020-8a36-88141b649074-kube-api-access-8dngl\") pod \"dcad4a5b-bda3-4020-8a36-88141b649074\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " Apr 24 22:00:25.371305 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371299 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcad4a5b-bda3-4020-8a36-88141b649074-tls-certs\") pod \"dcad4a5b-bda3-4020-8a36-88141b649074\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") " Apr 24 22:00:25.371636 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371390 2581 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-uds\") pod \"dcad4a5b-bda3-4020-8a36-88141b649074\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") "
Apr 24 22:00:25.371636 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371482 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-cache\") pod \"dcad4a5b-bda3-4020-8a36-88141b649074\" (UID: \"dcad4a5b-bda3-4020-8a36-88141b649074\") "
Apr 24 22:00:25.371732 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371661 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dcad4a5b-bda3-4020-8a36-88141b649074" (UID: "dcad4a5b-bda3-4020-8a36-88141b649074"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:25.371732 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371677 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dcad4a5b-bda3-4020-8a36-88141b649074" (UID: "dcad4a5b-bda3-4020-8a36-88141b649074"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:25.371732 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371713 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dcad4a5b-bda3-4020-8a36-88141b649074" (UID: "dcad4a5b-bda3-4020-8a36-88141b649074"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:25.371874 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371845 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:25.371874 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371858 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:25.371874 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.371867 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:25.372088 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.372065 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dcad4a5b-bda3-4020-8a36-88141b649074" (UID: "dcad4a5b-bda3-4020-8a36-88141b649074"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:25.373439 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.373400 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcad4a5b-bda3-4020-8a36-88141b649074-kube-api-access-8dngl" (OuterVolumeSpecName: "kube-api-access-8dngl") pod "dcad4a5b-bda3-4020-8a36-88141b649074" (UID: "dcad4a5b-bda3-4020-8a36-88141b649074"). InnerVolumeSpecName "kube-api-access-8dngl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:00:25.373549 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.373501 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcad4a5b-bda3-4020-8a36-88141b649074-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dcad4a5b-bda3-4020-8a36-88141b649074" (UID: "dcad4a5b-bda3-4020-8a36-88141b649074"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:00:25.472246 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.472217 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcad4a5b-bda3-4020-8a36-88141b649074-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:25.472246 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.472243 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dngl\" (UniqueName: \"kubernetes.io/projected/dcad4a5b-bda3-4020-8a36-88141b649074-kube-api-access-8dngl\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:25.472442 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.472254 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcad4a5b-bda3-4020-8a36-88141b649074-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:25.552453 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.552404 2581 generic.go:358] "Generic (PLEG): container finished" podID="dcad4a5b-bda3-4020-8a36-88141b649074" containerID="b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448" exitCode=0
Apr 24 22:00:25.552594 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.552503 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"
Apr 24 22:00:25.552594 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.552502 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerDied","Data":"b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448"}
Apr 24 22:00:25.552669 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.552612 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl" event={"ID":"dcad4a5b-bda3-4020-8a36-88141b649074","Type":"ContainerDied","Data":"9c4ebfca35bba8b7413995fb316f974705e47df794e3ad71cc4a5619b2bb577d"}
Apr 24 22:00:25.552669 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.552634 2581 scope.go:117] "RemoveContainer" containerID="b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448"
Apr 24 22:00:25.561552 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.561534 2581 scope.go:117] "RemoveContainer" containerID="3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb"
Apr 24 22:00:25.569012 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.568996 2581 scope.go:117] "RemoveContainer" containerID="1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0"
Apr 24 22:00:25.576935 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.576917 2581 scope.go:117] "RemoveContainer" containerID="b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448"
Apr 24 22:00:25.577001 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.576955 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"]
Apr 24 22:00:25.577204 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:00:25.577185 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448\": container with ID starting with b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448 not found: ID does not exist" containerID="b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448"
Apr 24 22:00:25.577261 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.577209 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448"} err="failed to get container status \"b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448\": rpc error: code = NotFound desc = could not find container \"b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448\": container with ID starting with b7d7539fedffdb7e2d95bff584c573068a6a31451838ab370cf648d127b29448 not found: ID does not exist"
Apr 24 22:00:25.577261 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.577227 2581 scope.go:117] "RemoveContainer" containerID="3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb"
Apr 24 22:00:25.577504 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:00:25.577474 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb\": container with ID starting with 3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb not found: ID does not exist" containerID="3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb"
Apr 24 22:00:25.577622 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.577509 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb"} err="failed to get container status \"3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb\": rpc error: code = NotFound desc = could not find container \"3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb\": container with ID starting with 3553d47c00c274e9116daac9ade7ddf512c301d1c206550221015ac9f3e458eb not found: ID does not exist"
Apr 24 22:00:25.577622 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.577531 2581 scope.go:117] "RemoveContainer" containerID="1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0"
Apr 24 22:00:25.577796 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:00:25.577780 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0\": container with ID starting with 1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0 not found: ID does not exist" containerID="1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0"
Apr 24 22:00:25.577852 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.577809 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0"} err="failed to get container status \"1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0\": rpc error: code = NotFound desc = could not find container \"1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0\": container with ID starting with 1dc0c97ca6ad3a5e320c87f64d8b84dd28968ec2ae630c4e46ac90b3c86b69c0 not found: ID does not exist"
Apr 24 22:00:25.581050 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.581029 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6748976872ldwl"]
Apr 24 22:00:25.874156 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:25.874120 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" path="/var/lib/kubelet/pods/dcad4a5b-bda3-4020-8a36-88141b649074/volumes"
Apr 24 22:00:30.117499 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:30.117466 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"]
Apr 24 22:00:30.118057 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:30.117776 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="main" containerID="cri-o://f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585" gracePeriod=30
Apr 24 22:00:30.118057 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:30.117810 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="tokenizer" containerID="cri-o://2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240" gracePeriod=30
Apr 24 22:00:30.577875 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:30.577841 2581 generic.go:358] "Generic (PLEG): container finished" podID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerID="f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585" exitCode=0
Apr 24 22:00:30.578039 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:30.577889 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerDied","Data":"f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585"}
Apr 24 22:00:31.266253 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.266225 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"
Apr 24 22:00:31.320191 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320160 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-uds\") pod \"fdc2520c-5e80-4345-bd52-dc4457adbad8\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") "
Apr 24 22:00:31.320373 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320210 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc2520c-5e80-4345-bd52-dc4457adbad8-tls-certs\") pod \"fdc2520c-5e80-4345-bd52-dc4457adbad8\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") "
Apr 24 22:00:31.320373 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320234 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-cache\") pod \"fdc2520c-5e80-4345-bd52-dc4457adbad8\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") "
Apr 24 22:00:31.320373 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320274 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bj9\" (UniqueName: \"kubernetes.io/projected/fdc2520c-5e80-4345-bd52-dc4457adbad8-kube-api-access-g5bj9\") pod \"fdc2520c-5e80-4345-bd52-dc4457adbad8\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") "
Apr 24 22:00:31.320373 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320308 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-kserve-provision-location\") pod \"fdc2520c-5e80-4345-bd52-dc4457adbad8\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") "
Apr 24 22:00:31.320373 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320336 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-tmp\") pod \"fdc2520c-5e80-4345-bd52-dc4457adbad8\" (UID: \"fdc2520c-5e80-4345-bd52-dc4457adbad8\") "
Apr 24 22:00:31.320650 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320486 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fdc2520c-5e80-4345-bd52-dc4457adbad8" (UID: "fdc2520c-5e80-4345-bd52-dc4457adbad8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:31.320650 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320597 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fdc2520c-5e80-4345-bd52-dc4457adbad8" (UID: "fdc2520c-5e80-4345-bd52-dc4457adbad8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:31.320650 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320625 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-uds\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:31.320758 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.320696 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fdc2520c-5e80-4345-bd52-dc4457adbad8" (UID: "fdc2520c-5e80-4345-bd52-dc4457adbad8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:31.321035 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.321011 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fdc2520c-5e80-4345-bd52-dc4457adbad8" (UID: "fdc2520c-5e80-4345-bd52-dc4457adbad8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:00:31.322434 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.322401 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc2520c-5e80-4345-bd52-dc4457adbad8-kube-api-access-g5bj9" (OuterVolumeSpecName: "kube-api-access-g5bj9") pod "fdc2520c-5e80-4345-bd52-dc4457adbad8" (UID: "fdc2520c-5e80-4345-bd52-dc4457adbad8"). InnerVolumeSpecName "kube-api-access-g5bj9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:00:31.322528 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.322415 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdc2520c-5e80-4345-bd52-dc4457adbad8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fdc2520c-5e80-4345-bd52-dc4457adbad8" (UID: "fdc2520c-5e80-4345-bd52-dc4457adbad8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:00:31.406948 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.406879 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kjgcf/must-gather-trfp2"]
Apr 24 22:00:31.407244 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407231 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="storage-initializer"
Apr 24 22:00:31.407289 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407245 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="storage-initializer"
Apr 24 22:00:31.407289 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407268 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="main"
Apr 24 22:00:31.407289 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407274 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="main"
Apr 24 22:00:31.407289 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407283 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="tokenizer"
Apr 24 22:00:31.407289 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407288 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="tokenizer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407299 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="main"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407305 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="main"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407312 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="storage-initializer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407317 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="storage-initializer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407324 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="tokenizer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407330 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="tokenizer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407394 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="tokenizer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407404 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcad4a5b-bda3-4020-8a36-88141b649074" containerName="main"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407410 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="tokenizer"
Apr 24 22:00:31.407482 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.407435 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerName="main"
Apr 24 22:00:31.411617 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.411601 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.414131 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.414098 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kjgcf\"/\"openshift-service-ca.crt\""
Apr 24 22:00:31.414255 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.414130 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kjgcf\"/\"kube-root-ca.crt\""
Apr 24 22:00:31.414255 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.414195 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kjgcf\"/\"default-dockercfg-7cdbh\""
Apr 24 22:00:31.420717 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.420693 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kjgcf/must-gather-trfp2"]
Apr 24 22:00:31.421052 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421029 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zqn\" (UniqueName: \"kubernetes.io/projected/203909aa-ca57-44c0-a03f-0f9de54a3e08-kube-api-access-v7zqn\") pod \"must-gather-trfp2\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.421150 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421132 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/203909aa-ca57-44c0-a03f-0f9de54a3e08-must-gather-output\") pod \"must-gather-trfp2\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.421210 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421197 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc2520c-5e80-4345-bd52-dc4457adbad8-tls-certs\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:31.421259 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421214 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-cache\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:31.421259 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421223 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5bj9\" (UniqueName: \"kubernetes.io/projected/fdc2520c-5e80-4345-bd52-dc4457adbad8-kube-api-access-g5bj9\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:31.421259 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421232 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-kserve-provision-location\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:31.421259 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.421242 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdc2520c-5e80-4345-bd52-dc4457adbad8-tokenizer-tmp\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\""
Apr 24 22:00:31.521925 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.521902 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/203909aa-ca57-44c0-a03f-0f9de54a3e08-must-gather-output\") pod \"must-gather-trfp2\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.522072 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.521955 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zqn\" (UniqueName: \"kubernetes.io/projected/203909aa-ca57-44c0-a03f-0f9de54a3e08-kube-api-access-v7zqn\") pod \"must-gather-trfp2\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.522223 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.522205 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/203909aa-ca57-44c0-a03f-0f9de54a3e08-must-gather-output\") pod \"must-gather-trfp2\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.529382 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.529361 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zqn\" (UniqueName: \"kubernetes.io/projected/203909aa-ca57-44c0-a03f-0f9de54a3e08-kube-api-access-v7zqn\") pod \"must-gather-trfp2\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.583213 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.583188 2581 generic.go:358] "Generic (PLEG): container finished" podID="fdc2520c-5e80-4345-bd52-dc4457adbad8" containerID="2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240" exitCode=0
Apr 24 22:00:31.583322 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.583222 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerDied","Data":"2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240"}
Apr 24 22:00:31.583322 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.583246 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx" event={"ID":"fdc2520c-5e80-4345-bd52-dc4457adbad8","Type":"ContainerDied","Data":"5f42b7bbaecdd0b4f40e9999125ce132e9375e23fbceee524ebda5c27cd7a5eb"}
Apr 24 22:00:31.583322 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.583255 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"
Apr 24 22:00:31.583322 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.583266 2581 scope.go:117] "RemoveContainer" containerID="2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240"
Apr 24 22:00:31.591484 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.591465 2581 scope.go:117] "RemoveContainer" containerID="f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585"
Apr 24 22:00:31.598994 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.598978 2581 scope.go:117] "RemoveContainer" containerID="7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370"
Apr 24 22:00:31.606283 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.606218 2581 scope.go:117] "RemoveContainer" containerID="2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240"
Apr 24 22:00:31.606940 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:00:31.606911 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240\": container with ID starting with 2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240 not found: ID does not exist" containerID="2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240"
Apr 24 22:00:31.607045 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.606947 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240"} err="failed to get container status \"2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240\": rpc error: code = NotFound desc = could not find container \"2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240\": container with ID starting with 2d1b9b1258816004bb66aee7acb6d3d8292b6bcb2ee2d32c8d3d1cfd79929240 not found: ID does not exist"
Apr 24 22:00:31.607045 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.606971 2581 scope.go:117] "RemoveContainer" containerID="f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585"
Apr 24 22:00:31.607269 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:00:31.607254 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585\": container with ID starting with f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585 not found: ID does not exist" containerID="f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585"
Apr 24 22:00:31.607335 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.607285 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585"} err="failed to get container status \"f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585\": rpc error: code = NotFound desc = could not find container \"f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585\": container with ID starting with f258635c2d08504ba96553de5cf07a6e5344297cc04844b3ade5d1bcdd22a585 not found: ID does not exist"
Apr 24 22:00:31.607335 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.607308 2581 scope.go:117] "RemoveContainer" containerID="7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370"
Apr 24 22:00:31.607606 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:00:31.607582 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370\": container with ID starting with 7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370 not found: ID does not exist" containerID="7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370"
Apr 24 22:00:31.607686 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.607612 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370"} err="failed to get container status \"7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370\": rpc error: code = NotFound desc = could not find container \"7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370\": container with ID starting with 7d7fc360e40f62162fcf01dd6313b11875e08db7fa949a4a6c91392d60e33370 not found: ID does not exist"
Apr 24 22:00:31.608532 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.608515 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"]
Apr 24 22:00:31.612584 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.612562 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedr4cx"]
Apr 24 22:00:31.721685 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.721637 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjgcf/must-gather-trfp2"
Apr 24 22:00:31.847002 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.846982 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kjgcf/must-gather-trfp2"]
Apr 24 22:00:31.849223 ip-10-0-139-15 kubenswrapper[2581]: W0424 22:00:31.849199 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod203909aa_ca57_44c0_a03f_0f9de54a3e08.slice/crio-b03f286f598d3407a447528d0d79b38dd1eba305affa044f85603bc7a34cb19f WatchSource:0}: Error finding container b03f286f598d3407a447528d0d79b38dd1eba305affa044f85603bc7a34cb19f: Status 404 returned error can't find the container with id b03f286f598d3407a447528d0d79b38dd1eba305affa044f85603bc7a34cb19f
Apr 24 22:00:31.874394 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:31.874370 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc2520c-5e80-4345-bd52-dc4457adbad8" path="/var/lib/kubelet/pods/fdc2520c-5e80-4345-bd52-dc4457adbad8/volumes"
Apr 24 22:00:32.593914 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:32.593837 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjgcf/must-gather-trfp2" event={"ID":"203909aa-ca57-44c0-a03f-0f9de54a3e08","Type":"ContainerStarted","Data":"b03f286f598d3407a447528d0d79b38dd1eba305affa044f85603bc7a34cb19f"}
Apr 24 22:00:36.613101 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:36.613066 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjgcf/must-gather-trfp2" event={"ID":"203909aa-ca57-44c0-a03f-0f9de54a3e08","Type":"ContainerStarted","Data":"a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff"}
Apr 24 22:00:37.619787 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:37.619749 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjgcf/must-gather-trfp2" event={"ID":"203909aa-ca57-44c0-a03f-0f9de54a3e08","Type":"ContainerStarted","Data":"a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83"}
Apr 24 22:00:37.636848 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:00:37.636799 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kjgcf/must-gather-trfp2" podStartSLOduration=2.050743218 podStartE2EDuration="6.636785121s" podCreationTimestamp="2026-04-24 22:00:31 +0000 UTC" firstStartedPulling="2026-04-24 22:00:31.850860727 +0000 UTC m=+2672.529697376" lastFinishedPulling="2026-04-24 22:00:36.436902628 +0000 UTC m=+2677.115739279" observedRunningTime="2026-04-24 22:00:37.634703867 +0000 UTC m=+2678.313540540" watchObservedRunningTime="2026-04-24 22:00:37.636785121 +0000 UTC m=+2678.315621821"
Apr 24 22:01:00.002025 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.001938 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log"
Apr 24 22:01:00.005737 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.005715 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log"
Apr 24 22:01:00.010042 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.010016 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 22:01:00.013273 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.013255 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 22:01:00.148148 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.148116 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rgd49_293aaa20-5463-4bbc-be4b-fa5c379dae0c/discovery/0.log"
Apr 24 22:01:00.175043 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.174976 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5fdf56dbd-s82zg_543220ca-e10a-465a-a5ee-a24026536361/router/0.log"
Apr 24 22:01:00.951601 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.951565 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rgd49_293aaa20-5463-4bbc-be4b-fa5c379dae0c/discovery/0.log"
Apr 24 22:01:00.982108 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:00.982083 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5fdf56dbd-s82zg_543220ca-e10a-465a-a5ee-a24026536361/router/0.log"
Apr 24 22:01:01.828728 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:01.828697 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-cshcc_f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5/manager/0.log"
Apr 24 22:01:01.854302 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:01.854267 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-9lv9j_8c7e8e7c-0861-4331-996f-ff602b50ef8b/manager/0.log"
Apr 24 22:01:02.712440 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:02.712398 2581 generic.go:358] "Generic (PLEG): container finished" podID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerID="a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff" exitCode=0
Apr 24 22:01:02.712685 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:02.712476 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kjgcf/must-gather-trfp2"
event={"ID":"203909aa-ca57-44c0-a03f-0f9de54a3e08","Type":"ContainerDied","Data":"a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff"} Apr 24 22:01:02.712873 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:02.712857 2581 scope.go:117] "RemoveContainer" containerID="a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff" Apr 24 22:01:03.532925 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:03.532893 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjgcf_must-gather-trfp2_203909aa-ca57-44c0-a03f-0f9de54a3e08/gather/0.log" Apr 24 22:01:06.982754 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:06.982727 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nbsck_a9f02e2a-ddce-4e3f-aec7-6bad8a006165/global-pull-secret-syncer/0.log" Apr 24 22:01:07.087945 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:07.087920 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kgjxk_aff945d8-1222-4254-8b87-2cd6e5517284/konnectivity-agent/0.log" Apr 24 22:01:07.207332 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:07.207306 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-15.ec2.internal_099fd95062106834f37953dba57d38ac/haproxy/0.log" Apr 24 22:01:08.994265 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:08.994228 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kjgcf/must-gather-trfp2"] Apr 24 22:01:08.994677 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:08.994478 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kjgcf/must-gather-trfp2" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="copy" containerID="cri-o://a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83" gracePeriod=2 Apr 24 22:01:08.996862 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:08.996810 2581 
status_manager.go:895] "Failed to get status for pod" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" pod="openshift-must-gather-kjgcf/must-gather-trfp2" err="pods \"must-gather-trfp2\" is forbidden: User \"system:node:ip-10-0-139-15.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kjgcf\": no relationship found between node 'ip-10-0-139-15.ec2.internal' and this object" Apr 24 22:01:08.997797 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:08.997775 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kjgcf/must-gather-trfp2"] Apr 24 22:01:09.228835 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.228813 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjgcf_must-gather-trfp2_203909aa-ca57-44c0-a03f-0f9de54a3e08/copy/0.log" Apr 24 22:01:09.229183 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.229164 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kjgcf/must-gather-trfp2" Apr 24 22:01:09.231152 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.231126 2581 status_manager.go:895] "Failed to get status for pod" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" pod="openshift-must-gather-kjgcf/must-gather-trfp2" err="pods \"must-gather-trfp2\" is forbidden: User \"system:node:ip-10-0-139-15.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kjgcf\": no relationship found between node 'ip-10-0-139-15.ec2.internal' and this object" Apr 24 22:01:09.352686 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.352616 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/203909aa-ca57-44c0-a03f-0f9de54a3e08-must-gather-output\") pod \"203909aa-ca57-44c0-a03f-0f9de54a3e08\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " Apr 24 22:01:09.352811 ip-10-0-139-15 
kubenswrapper[2581]: I0424 22:01:09.352686 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7zqn\" (UniqueName: \"kubernetes.io/projected/203909aa-ca57-44c0-a03f-0f9de54a3e08-kube-api-access-v7zqn\") pod \"203909aa-ca57-44c0-a03f-0f9de54a3e08\" (UID: \"203909aa-ca57-44c0-a03f-0f9de54a3e08\") " Apr 24 22:01:09.354771 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.354750 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203909aa-ca57-44c0-a03f-0f9de54a3e08-kube-api-access-v7zqn" (OuterVolumeSpecName: "kube-api-access-v7zqn") pod "203909aa-ca57-44c0-a03f-0f9de54a3e08" (UID: "203909aa-ca57-44c0-a03f-0f9de54a3e08"). InnerVolumeSpecName "kube-api-access-v7zqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:01:09.359185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.359160 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203909aa-ca57-44c0-a03f-0f9de54a3e08-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "203909aa-ca57-44c0-a03f-0f9de54a3e08" (UID: "203909aa-ca57-44c0-a03f-0f9de54a3e08"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:09.454063 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.454035 2581 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/203909aa-ca57-44c0-a03f-0f9de54a3e08-must-gather-output\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 22:01:09.454063 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.454058 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7zqn\" (UniqueName: \"kubernetes.io/projected/203909aa-ca57-44c0-a03f-0f9de54a3e08-kube-api-access-v7zqn\") on node \"ip-10-0-139-15.ec2.internal\" DevicePath \"\"" Apr 24 22:01:09.742360 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.742332 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kjgcf_must-gather-trfp2_203909aa-ca57-44c0-a03f-0f9de54a3e08/copy/0.log" Apr 24 22:01:09.742683 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.742659 2581 generic.go:358] "Generic (PLEG): container finished" podID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerID="a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83" exitCode=143 Apr 24 22:01:09.742787 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.742709 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kjgcf/must-gather-trfp2" Apr 24 22:01:09.742787 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.742758 2581 scope.go:117] "RemoveContainer" containerID="a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83" Apr 24 22:01:09.744639 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.744615 2581 status_manager.go:895] "Failed to get status for pod" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" pod="openshift-must-gather-kjgcf/must-gather-trfp2" err="pods \"must-gather-trfp2\" is forbidden: User \"system:node:ip-10-0-139-15.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kjgcf\": no relationship found between node 'ip-10-0-139-15.ec2.internal' and this object" Apr 24 22:01:09.751267 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.751241 2581 scope.go:117] "RemoveContainer" containerID="a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff" Apr 24 22:01:09.753126 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.753100 2581 status_manager.go:895] "Failed to get status for pod" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" pod="openshift-must-gather-kjgcf/must-gather-trfp2" err="pods \"must-gather-trfp2\" is forbidden: User \"system:node:ip-10-0-139-15.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kjgcf\": no relationship found between node 'ip-10-0-139-15.ec2.internal' and this object" Apr 24 22:01:09.764205 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.764186 2581 scope.go:117] "RemoveContainer" containerID="a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83" Apr 24 22:01:09.764488 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:01:09.764469 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83\": container with ID starting 
with a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83 not found: ID does not exist" containerID="a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83" Apr 24 22:01:09.764559 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.764495 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83"} err="failed to get container status \"a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83\": rpc error: code = NotFound desc = could not find container \"a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83\": container with ID starting with a01edf0d3f09879ffccbb21b5fb73d752b07b432eccde59afe68b80038c83c83 not found: ID does not exist" Apr 24 22:01:09.764559 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.764510 2581 scope.go:117] "RemoveContainer" containerID="a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff" Apr 24 22:01:09.764711 ip-10-0-139-15 kubenswrapper[2581]: E0424 22:01:09.764694 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff\": container with ID starting with a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff not found: ID does not exist" containerID="a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff" Apr 24 22:01:09.764753 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.764717 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff"} err="failed to get container status \"a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff\": rpc error: code = NotFound desc = could not find container \"a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff\": container with ID starting with 
a1fa092020e761effea5363c8236ba82dae1ea44bcb448c51e00bc25a7952cff not found: ID does not exist" Apr 24 22:01:09.872543 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.872516 2581 status_manager.go:895] "Failed to get status for pod" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" pod="openshift-must-gather-kjgcf/must-gather-trfp2" err="pods \"must-gather-trfp2\" is forbidden: User \"system:node:ip-10-0-139-15.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kjgcf\": no relationship found between node 'ip-10-0-139-15.ec2.internal' and this object" Apr 24 22:01:09.873871 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:09.873851 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" path="/var/lib/kubelet/pods/203909aa-ca57-44c0-a03f-0f9de54a3e08/volumes" Apr 24 22:01:11.143606 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:11.143577 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-cshcc_f6dcc8a9-8a3d-435a-8db9-70fc2b938bf5/manager/0.log" Apr 24 22:01:11.204601 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:11.204575 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-9lv9j_8c7e8e7c-0861-4331-996f-ff602b50ef8b/manager/0.log" Apr 24 22:01:12.294280 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:12.294252 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lhgp5_a4c82719-9c98-4a75-864d-75fb12509cb1/cluster-monitoring-operator/0.log" Apr 24 22:01:12.496649 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:12.496621 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-grgjm_44be5a66-af42-4898-93c0-df61266a91dd/node-exporter/0.log" Apr 24 22:01:12.518655 ip-10-0-139-15 kubenswrapper[2581]: 
I0424 22:01:12.518635 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-grgjm_44be5a66-af42-4898-93c0-df61266a91dd/kube-rbac-proxy/0.log" Apr 24 22:01:12.541217 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:12.541196 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-grgjm_44be5a66-af42-4898-93c0-df61266a91dd/init-textfile/0.log" Apr 24 22:01:12.881441 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:12.881392 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v7997_de8a23e4-07a9-437d-baed-a6d60d4f5485/prometheus-operator/0.log" Apr 24 22:01:12.905936 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:12.905908 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v7997_de8a23e4-07a9-437d-baed-a6d60d4f5485/kube-rbac-proxy/0.log" Apr 24 22:01:12.937313 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:12.937287 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-fthv5_05babe0a-65a7-4208-bdf7-46e2e3b914e9/prometheus-operator-admission-webhook/0.log" Apr 24 22:01:14.386890 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:14.386861 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-499sv_680befb0-2e56-4df6-b7ca-58caea84d887/networking-console-plugin/0.log" Apr 24 22:01:14.901368 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:14.901337 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/1.log" Apr 24 22:01:14.905746 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:14.905719 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c7lrn_e187095c-23db-4e09-af90-8e136f238cec/console-operator/2.log" Apr 24 22:01:15.421544 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.421517 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-mlhtc_6eacc4cf-b65c-4484-bacb-a5c0e01cefac/download-server/0.log" Apr 24 22:01:15.879149 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879121 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr"] Apr 24 22:01:15.879479 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879466 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="gather" Apr 24 22:01:15.879524 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879481 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="gather" Apr 24 22:01:15.879524 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879490 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="copy" Apr 24 22:01:15.879524 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879495 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="copy" Apr 24 22:01:15.879625 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879542 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="gather" Apr 24 22:01:15.879625 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.879551 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="203909aa-ca57-44c0-a03f-0f9de54a3e08" containerName="copy" Apr 24 22:01:15.882660 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.882636 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:15.884775 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.884756 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xhf46\"/\"kube-root-ca.crt\"" Apr 24 22:01:15.885607 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.885589 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xhf46\"/\"default-dockercfg-p8nxl\"" Apr 24 22:01:15.885730 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.885629 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xhf46\"/\"openshift-service-ca.crt\"" Apr 24 22:01:15.891740 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.891720 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr"] Apr 24 22:01:15.912565 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:15.912531 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-fvtdb_2cd953b8-a92d-4621-a038-746bab77ff9f/volume-data-source-validator/0.log" Apr 24 22:01:16.011975 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.011946 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-proc\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.012131 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.011992 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-podres\") pod 
\"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.012131 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.012012 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-sys\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.012131 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.012101 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnlm\" (UniqueName: \"kubernetes.io/projected/2c6a95e7-66b1-4eae-863b-50fad8dca57f-kube-api-access-5vnlm\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.012258 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.012146 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-lib-modules\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113021 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.112991 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-proc\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113027 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-podres\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113048 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-sys\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113071 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vnlm\" (UniqueName: \"kubernetes.io/projected/2c6a95e7-66b1-4eae-863b-50fad8dca57f-kube-api-access-5vnlm\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113115 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-proc\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113132 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-sys\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" 
Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113147 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-podres\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113185 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113167 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-lib-modules\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.113457 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.113267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c6a95e7-66b1-4eae-863b-50fad8dca57f-lib-modules\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.120396 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.120375 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vnlm\" (UniqueName: \"kubernetes.io/projected/2c6a95e7-66b1-4eae-863b-50fad8dca57f-kube-api-access-5vnlm\") pod \"perf-node-gather-daemonset-pfpxr\" (UID: \"2c6a95e7-66b1-4eae-863b-50fad8dca57f\") " pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" Apr 24 22:01:16.194264 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.194243 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr"
Apr 24 22:01:16.320450 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.320405 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr"]
Apr 24 22:01:16.322206 ip-10-0-139-15 kubenswrapper[2581]: W0424 22:01:16.322178 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2c6a95e7_66b1_4eae_863b_50fad8dca57f.slice/crio-e704c85377e51d15df7a5ffad134f7d476168ffd017b5bc24f6770186d9ee74b WatchSource:0}: Error finding container e704c85377e51d15df7a5ffad134f7d476168ffd017b5bc24f6770186d9ee74b: Status 404 returned error can't find the container with id e704c85377e51d15df7a5ffad134f7d476168ffd017b5bc24f6770186d9ee74b
Apr 24 22:01:16.323820 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.323800 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:01:16.691050 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.691022 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w76jt_d79fb269-2ec1-4e09-a8d6-9b2a367d21b6/dns/0.log"
Apr 24 22:01:16.711922 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.711900 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w76jt_d79fb269-2ec1-4e09-a8d6-9b2a367d21b6/kube-rbac-proxy/0.log"
Apr 24 22:01:16.769491 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.769464 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" event={"ID":"2c6a95e7-66b1-4eae-863b-50fad8dca57f","Type":"ContainerStarted","Data":"5a23550caa54d4ca5412a550e1239c1225ebcf1dea9ce489936d63bd014ad2da"}
Apr 24 22:01:16.769631 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.769497 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" event={"ID":"2c6a95e7-66b1-4eae-863b-50fad8dca57f","Type":"ContainerStarted","Data":"e704c85377e51d15df7a5ffad134f7d476168ffd017b5bc24f6770186d9ee74b"}
Apr 24 22:01:16.769631 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.769556 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr"
Apr 24 22:01:16.774483 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.774462 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5pvk7_92521ad1-7ba9-4bdd-bc3b-f470cd17cfef/dns-node-resolver/0.log"
Apr 24 22:01:16.785218 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:16.785177 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr" podStartSLOduration=1.785161214 podStartE2EDuration="1.785161214s" podCreationTimestamp="2026-04-24 22:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:01:16.784576841 +0000 UTC m=+2717.463413510" watchObservedRunningTime="2026-04-24 22:01:16.785161214 +0000 UTC m=+2717.463997887"
Apr 24 22:01:17.339511 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:17.339484 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nwpfl_801b2a6e-b16d-4e63-b007-af7d6c9273f5/node-ca/0.log"
Apr 24 22:01:18.108642 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:18.108616 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rgd49_293aaa20-5463-4bbc-be4b-fa5c379dae0c/discovery/0.log"
Apr 24 22:01:18.156467 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:18.156417 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5fdf56dbd-s82zg_543220ca-e10a-465a-a5ee-a24026536361/router/0.log"
Apr 24 22:01:18.634504 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:18.634475 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pxf27_aefd5dcc-7f58-4fab-8028-2cffcff95339/serve-healthcheck-canary/0.log"
Apr 24 22:01:19.010362 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:19.010327 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xfzqb_d1b7bcd1-e58f-42c3-9a78-a06df4ff2253/insights-operator/0.log"
Apr 24 22:01:19.010991 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:19.010972 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xfzqb_d1b7bcd1-e58f-42c3-9a78-a06df4ff2253/insights-operator/1.log"
Apr 24 22:01:19.031340 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:19.031316 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2mqlt_ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9/kube-rbac-proxy/0.log"
Apr 24 22:01:19.051885 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:19.051867 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2mqlt_ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9/exporter/0.log"
Apr 24 22:01:19.072407 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:19.072386 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2mqlt_ec5c2f4e-e1d8-48e1-bcc0-bf72afefb8e9/extractor/0.log"
Apr 24 22:01:22.198342 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:22.198307 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-74fc8f6f96-bfwqh_0c576436-6187-4ab4-8a5e-798cd5bd02c9/manager/0.log"
Apr 24 22:01:22.269442 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:22.269401 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-znlgd_0589c56e-7987-421c-82d1-0b565b112246/server/0.log"
Apr 24 22:01:22.784764 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:22.784737 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xhf46/perf-node-gather-daemonset-pfpxr"
Apr 24 22:01:27.091035 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:27.090985 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-swmml_d68a0f58-d2e4-4a3f-a00d-2554fcccb09c/migrator/0.log"
Apr 24 22:01:27.109759 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:27.109732 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-swmml_d68a0f58-d2e4-4a3f-a00d-2554fcccb09c/graceful-termination/0.log"
Apr 24 22:01:27.481027 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:27.480994 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z962g_cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45/kube-storage-version-migrator-operator/1.log"
Apr 24 22:01:27.481877 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:27.481857 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z962g_cdd306f5-b3ab-47c4-ac0d-ba9ba28c5e45/kube-storage-version-migrator-operator/0.log"
Apr 24 22:01:28.441357 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.441329 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9rv78_ff16a19a-c677-4d51-81d3-8d67d7ce1749/kube-multus/0.log"
Apr 24 22:01:28.464416 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.464394 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:01:28.483867 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.483847 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/egress-router-binary-copy/0.log"
Apr 24 22:01:28.504874 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.504855 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/cni-plugins/0.log"
Apr 24 22:01:28.524962 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.524937 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/bond-cni-plugin/0.log"
Apr 24 22:01:28.544917 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.544899 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/routeoverride-cni/0.log"
Apr 24 22:01:28.566049 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.566028 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/whereabouts-cni-bincopy/0.log"
Apr 24 22:01:28.584035 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:28.584014 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5shvq_c61fee18-e272-4bf5-aa08-65392bba68b6/whereabouts-cni/0.log"
Apr 24 22:01:29.002531 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:29.002480 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n487x_657a2c9b-4e75-4d61-bff2-d8abdd05825d/network-metrics-daemon/0.log"
Apr 24 22:01:29.022152 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:29.022049 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n487x_657a2c9b-4e75-4d61-bff2-d8abdd05825d/kube-rbac-proxy/0.log"
Apr 24 22:01:30.621807 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.621775 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-controller/0.log"
Apr 24 22:01:30.641557 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.641528 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/0.log"
Apr 24 22:01:30.653147 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.653093 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovn-acl-logging/1.log"
Apr 24 22:01:30.670893 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.670867 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/kube-rbac-proxy-node/0.log"
Apr 24 22:01:30.691321 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.691291 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:01:30.714355 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.714301 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/northd/0.log"
Apr 24 22:01:30.734386 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.734369 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/nbdb/0.log"
Apr 24 22:01:30.754918 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.754897 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/sbdb/0.log"
Apr 24 22:01:30.854364 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:30.854334 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qnlsv_56d7cab8-8a8d-47a6-81da-f1f67f4aed59/ovnkube-controller/0.log"
Apr 24 22:01:31.874990 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:31.874913 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-kc9q6_76435f4f-785c-4dce-912c-13fbc131a04a/check-endpoints/0.log"
Apr 24 22:01:31.916831 ip-10-0-139-15 kubenswrapper[2581]: I0424 22:01:31.916801 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5shjj_9018a4db-1967-45ae-8ad3-7fdd04d6a4d1/network-check-target-container/0.log"