Apr 23 17:55:49.964423 ip-10-0-143-63 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 17:55:49.964434 ip-10-0-143-63 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 17:55:49.964441 ip-10-0-143-63 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 17:55:49.964690 ip-10-0-143-63 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 17:56:00.016545 ip-10-0-143-63 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 17:56:00.016577 ip-10-0-143-63 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f3539c4141ca497c8e3cbdf668cb8352 --
Apr 23 17:58:35.632853 ip-10-0-143-63 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:58:36.105046 ip-10-0-143-63 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:36.105046 ip-10-0-143-63 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:58:36.105046 ip-10-0-143-63 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:36.105046 ip-10-0-143-63 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:58:36.105046 ip-10-0-143-63 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:36.106053 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.105964 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:58:36.108177 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108162 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:36.108177 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108177 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108181 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108184 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108188 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108191 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108194 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108196 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108199 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108202 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108205 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108207 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108210 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108212 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108215 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108217 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108220 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108224 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108228 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108231 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:36.108238 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108234 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108237 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108239 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108242 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108252 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108255 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108258 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108260 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108263 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108265 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108268 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108270 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108273 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108275 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108278 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108280 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108283 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108285 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108288 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108290 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:36.108689 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108293 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108295 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108298 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108301 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108303 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108306 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108308 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108311 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108313 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108316 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108318 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108320 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108323 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108325 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108328 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108331 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108334 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108336 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108339 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108341 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:36.109265 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108344 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108347 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108351 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108354 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108357 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108360 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108363 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108366 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108368 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108371 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108374 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108377 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108379 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108382 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108384 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108386 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108389 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108391 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108394 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:36.109779 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108396 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108398 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108401 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108403 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108405 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108408 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108411 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108788 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108793 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108796 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108799 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108801 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108804 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108807 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108809 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108812 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108814 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108817 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108819 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108822 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:36.110229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108825 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108827 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108829 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108832 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108835 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108838 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108840 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108842 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108845 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108847 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108850 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108852 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108854 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108857 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108859 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108862 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108864 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108866 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108869 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108872 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:36.110713 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108875 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108877 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108879 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108882 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108886 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108888 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108890 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108894 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108897 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108900 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108902 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108905 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108907 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108909 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108912 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108916 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108920 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108923 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108926 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:36.111229 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108928 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108931 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108933 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108935 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108938 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108940 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108943 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108945 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108948 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108950 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108952 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108955 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108958 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108961 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108963 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108966 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108968 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108971 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108973 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:36.111731 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108975 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108978 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108981 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108983 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108986 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108989 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108991 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108994 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108996 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.108999 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109001 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109004 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109006 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109009 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109011 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109083 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109090 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109097 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109102 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109106 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109110 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:58:36.112191 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109114 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109119 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109122 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109126 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109130 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109133 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109136 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109139 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109142 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109145 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109148 2578 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109151 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109153 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109158 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109161 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109163 2578 flags.go:64] FLAG: --config-dir=""
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109166 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109169 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109177 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109181 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109184 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109187 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109190 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109193 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:58:36.112705 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109196 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109200 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109202 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109206 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109209 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109212 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109215 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109218 2578 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109221 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109230 2578 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109233 2578 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109236 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109239 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109242 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109246
2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109249 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109252 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109255 2578 flags.go:64] FLAG: --eviction-soft="" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109258 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109261 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109264 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109267 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109270 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109273 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109275 2578 flags.go:64] FLAG: --feature-gates="" Apr 23 17:58:36.113276 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109279 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109282 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109285 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109288 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:58:36.113893 
ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109291 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109294 2578 flags.go:64] FLAG: --help="false" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109297 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109308 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109311 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109314 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109317 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109320 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109323 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109326 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109329 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109332 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109335 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109338 2578 flags.go:64] FLAG: 
--kube-api-qps="50" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109342 2578 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109345 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109347 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109350 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109353 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109356 2578 flags.go:64] FLAG: --lock-file="" Apr 23 17:58:36.113893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109358 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109361 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109364 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109369 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109372 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109374 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109377 2578 flags.go:64] FLAG: --logging-format="text" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109380 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 
17:58:36.109383 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109386 2578 flags.go:64] FLAG: --manifest-url="" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109388 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109392 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109395 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109399 2578 flags.go:64] FLAG: --max-pods="110" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109402 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109405 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109413 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109416 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109421 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109424 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109427 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109434 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109437 2578 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109440 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:58:36.114445 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109443 2578 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109446 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109452 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109455 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109458 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109461 2578 flags.go:64] FLAG: --port="10250" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109463 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109466 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e513dd05b3d9de6d" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109469 2578 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109472 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109475 2578 flags.go:64] FLAG: --register-node="true" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109478 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109481 2578 flags.go:64] FLAG: --register-with-taints="" Apr 23 
17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109484 2578 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109487 2578 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109490 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109492 2578 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109496 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109499 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109502 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109504 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109507 2578 flags.go:64] FLAG: --runonce="false" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109510 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109513 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109516 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:58:36.115112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109524 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109529 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109544 2578 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109547 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109550 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109554 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109557 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109560 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109563 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109566 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109569 2578 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109572 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109577 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109580 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109583 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109589 2578 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109592 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 
17:58:36.109595 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109598 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109601 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109604 2578 flags.go:64] FLAG: --v="2" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109608 2578 flags.go:64] FLAG: --version="false" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109611 2578 flags.go:64] FLAG: --vmodule="" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109616 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109619 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:58:36.115740 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109707 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109710 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109714 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109717 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109719 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109722 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:58:36.116332 
ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109725 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109727 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109735 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109738 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109741 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109743 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109746 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109749 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109751 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109754 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109756 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109759 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109762 2578 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109764 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:58:36.116332 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109767 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109769 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109772 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109774 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109777 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109780 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109782 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109785 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109788 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109790 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109793 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 
17:58:36.109795 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109798 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109800 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109803 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109805 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109808 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109810 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109812 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:58:36.116847 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109815 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109818 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109821 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109824 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109827 2578 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109829 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109832 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109834 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109837 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109839 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109841 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109844 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109846 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109849 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109852 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109854 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109857 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109859 2578 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109862 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:58:36.117302 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109866 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109869 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109872 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109875 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109878 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109880 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109883 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109886 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109888 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109891 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109893 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:58:36.117797 
ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109896 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109898 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109900 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109904 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109906 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109909 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109912 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109915 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109917 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:58:36.117797 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109920 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109922 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109924 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109927 2578 feature_gate.go:328] unrecognized feature 
gate: IrreconcilableMachineConfig Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109929 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109932 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109935 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.109939 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.109947 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.117145 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.117161 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117210 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117216 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117219 2578 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117222 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:58:36.118268 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117226 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117228 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117231 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117234 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117236 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117240 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117244 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117247 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117249 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117252 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117255 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117257 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117260 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117263 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117266 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117268 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117271 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117273 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117275 2578 feature_gate.go:328] unrecognized feature gate: 
SignatureStores Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117278 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:58:36.118667 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117280 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117283 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117285 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117288 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117290 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117293 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117295 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117299 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117302 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117305 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117308 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117311 
2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117314 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117316 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117319 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117323 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117325 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117328 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117330 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117333 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:58:36.119152 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117335 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117338 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117340 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117343 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:58:36.119655 
ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117345 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117348 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117353 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117356 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117358 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117361 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117363 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117366 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117368 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117371 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117373 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117376 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117378 2578 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117381 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117383 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117386 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:58:36.119655 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117389 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117391 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117394 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117397 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117399 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117402 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117404 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117406 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117409 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 
17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117411 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117414 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117416 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117420 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117424 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117427 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117430 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117433 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117436 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117438 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:58:36.120126 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117441 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117444 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 
17:58:36.117446 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.117452 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117559 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117563 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117566 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117569 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117572 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117575 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117578 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117580 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117583 
2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117586 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117588 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117591 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:58:36.120657 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117593 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117596 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117598 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117600 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117604 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117608 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117610 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117613 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117616 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117618 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117621 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117623 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117626 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117628 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117631 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117633 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117635 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 
17:58:36.117638 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117641 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:58:36.121079 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117644 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117646 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117649 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117651 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117654 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117656 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117659 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117661 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117664 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117667 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117669 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 
23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117672 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117674 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117677 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117679 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117682 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117685 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117687 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117689 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117692 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:58:36.121516 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117694 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117697 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117700 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117702 2578 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117705 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117707 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117710 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117712 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117716 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117719 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117722 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117726 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117728 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117731 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117734 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117737 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 
17:58:36.117740 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117742 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117745 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117748 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:58:36.122013 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117751 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117754 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117757 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117759 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117762 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117765 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117768 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117770 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117773 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 
17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117775 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117778 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117780 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117783 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117785 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:36.117787 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:58:36.122480 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.117793 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:58:36.122851 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.117893 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 17:58:36.122851 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.120696 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 17:58:36.122851 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.121632 2578 server.go:1019] 
"Starting client certificate rotation" Apr 23 17:58:36.122851 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.121736 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:58:36.122851 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.122492 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:58:36.148946 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.148932 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:58:36.151476 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.151461 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:58:36.168771 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.168751 2578 log.go:25] "Validated CRI v1 runtime API" Apr 23 17:58:36.176423 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.176404 2578 log.go:25] "Validated CRI v1 image API" Apr 23 17:58:36.177662 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.177646 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 17:58:36.178368 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.178351 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:58:36.180101 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.180077 2578 fs.go:135] Filesystem UUIDs: map[550b63b0-35ca-4187-b4f4-a84400a2a398:/dev/nvme0n1p4 6d6ee900-fe77-4954-8fcd-24e6c349094b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 23 17:58:36.180180 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.180096 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 
blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 17:58:36.185799 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.185688 2578 manager.go:217] Machine: {Timestamp:2026-04-23 17:58:36.18360903 +0000 UTC m=+0.417350936 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199525 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec274348babf4510d3d74f6ee666dd72 SystemUUID:ec274348-babf-4510-d3d7-4f6ee666dd72 BootID:f3539c41-41ca-497c-8e3c-bdf668cb8352 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3d:68:7f:ec:31 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3d:68:7f:ec:31 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:a1:f0:6b:e4:ae Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} 
{PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 17:58:36.185799 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.185794 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 17:58:36.185915 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.185903 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 17:58:36.188724 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.188700 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 17:58:36.188878 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.188726 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-63.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 17:58:36.188963 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.188893 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 17:58:36.188963 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.188906 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 17:58:36.188963 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.188923 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:58:36.190038 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.190026 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:58:36.190944 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.190932 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:58:36.191070 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.191060 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 17:58:36.193546 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.193524 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 23 17:58:36.193602 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.193564 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 17:58:36.193602 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.193586 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 17:58:36.193602 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.193599 2578 kubelet.go:397] "Adding apiserver pod source" Apr 23 17:58:36.193718 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.193611 2578 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 23 17:58:36.194691 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.194677 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:58:36.194762 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.194699 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:58:36.197473 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.197455 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:58:36.198247 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.198231 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cc7dl" Apr 23 17:58:36.201351 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.201326 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:58:36.202872 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202852 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202880 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202887 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202892 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202900 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202906 2578 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202912 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202918 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202926 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202935 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202944 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:58:36.202952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.202952 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:58:36.203926 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.203909 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:58:36.203959 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.203942 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:58:36.206081 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.205897 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cc7dl" Apr 23 17:58:36.206156 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.206024 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-63.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:58:36.206156 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.206056 2578 reflector.go:200] 
"Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:58:36.206227 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.206162 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-63.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:58:36.207984 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.207972 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:58:36.208026 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.208005 2578 server.go:1295] "Started kubelet" Apr 23 17:58:36.208112 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.208088 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:58:36.208185 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.208132 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:58:36.208255 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.208243 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:58:36.208778 ip-10-0-143-63 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 17:58:36.209229 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.209211 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 17:58:36.210807 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.210792 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 23 17:58:36.216945 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.216929 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 17:58:36.217501 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.217490 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 17:58:36.217753 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.217723 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 17:58:36.218244 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218228 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 17:58:36.218297 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218231 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 17:58:36.218297 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218258 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 17:58:36.218402 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218355 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 23 17:58:36.218402 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218365 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.218563 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found" Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218791 2578 factory.go:153] 
Registering CRI-O factory Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218839 2578 factory.go:223] Registration of the crio container factory successfully Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218901 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218910 2578 factory.go:55] Registering systemd factory Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218917 2578 factory.go:223] Registration of the systemd container factory successfully Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218940 2578 factory.go:103] Registering Raw factory Apr 23 17:58:36.219353 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.218949 2578 manager.go:1196] Started watching for new ooms in manager Apr 23 17:58:36.219838 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.219408 2578 manager.go:319] Starting recovery of all containers Apr 23 17:58:36.220392 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.220369 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:36.223181 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.223156 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-63.ec2.internal\" not found" node="ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.229698 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.229681 2578 manager.go:324] Recovery completed Apr 23 17:58:36.234778 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.234761 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:36.237255 
ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.237239 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:36.237320 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.237267 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:36.237320 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.237279 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:36.237728 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.237712 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 17:58:36.237787 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.237728 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 17:58:36.237787 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.237768 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:58:36.240272 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.240259 2578 policy_none.go:49] "None policy: Start" Apr 23 17:58:36.240320 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.240276 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 17:58:36.240320 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.240286 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 23 17:58:36.264056 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.264042 2578 manager.go:341] "Starting Device Plugin manager" Apr 23 17:58:36.264144 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.264074 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 17:58:36.264144 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.264087 2578 server.go:85] "Starting device plugin registration server" Apr 23 17:58:36.264307 ip-10-0-143-63 
kubenswrapper[2578]: I0423 17:58:36.264296 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 17:58:36.264358 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.264307 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 17:58:36.264434 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.264414 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 17:58:36.264516 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.264504 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 17:58:36.264516 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.264516 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 17:58:36.265023 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.264952 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 17:58:36.265023 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.265000 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-63.ec2.internal\" not found" Apr 23 17:58:36.351589 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.351561 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 17:58:36.352727 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.352680 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 17:58:36.352727 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.352719 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 17:58:36.352843 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.352742 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 17:58:36.352843 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.352751 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 17:58:36.352843 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.352789 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 17:58:36.355163 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.355118 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:36.364571 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.364560 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:36.365572 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.365558 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:36.365631 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.365587 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:36.365631 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.365599 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:36.365631 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.365619 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.373723 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.373709 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.373772 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.373730 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-63.ec2.internal\": node \"ip-10-0-143-63.ec2.internal\" not found" Apr 23 17:58:36.388476 
ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.388459 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found" Apr 23 17:58:36.452974 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.452952 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal"] Apr 23 17:58:36.453039 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.453014 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:36.454402 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.454386 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:36.454485 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.454415 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:36.454485 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.454430 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:36.455642 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.455630 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:36.455779 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.455766 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.455814 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.455798 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:36.456263 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.456244 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:36.456337 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.456247 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:36.456337 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.456302 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:36.456337 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.456313 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:36.456337 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.456278 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:36.456466 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.456344 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:36.457397 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.457384 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.457440 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.457412 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:36.458050 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.458036 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:36.458099 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.458061 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:36.458099 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.458075 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:36.480103 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.480085 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-63.ec2.internal\" not found" node="ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.484393 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.484378 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-63.ec2.internal\" not found" node="ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.489275 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.489260 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found" Apr 23 17:58:36.520349 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.520333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b7732360440c87ff89761ec4d043808-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal\" (UID: \"0b7732360440c87ff89761ec4d043808\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.520426 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.520356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/858e9dbb3f829b7de0afbb8a36ca323c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-63.ec2.internal\" (UID: \"858e9dbb3f829b7de0afbb8a36ca323c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.520426 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.520375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0b7732360440c87ff89761ec4d043808-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal\" (UID: \"0b7732360440c87ff89761ec4d043808\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.589538 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.589519 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found" Apr 23 17:58:36.620879 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.620836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0b7732360440c87ff89761ec4d043808-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal\" (UID: \"0b7732360440c87ff89761ec4d043808\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.620879 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.620859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0b7732360440c87ff89761ec4d043808-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal\" (UID: \"0b7732360440c87ff89761ec4d043808\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.620879 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.620875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/858e9dbb3f829b7de0afbb8a36ca323c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-63.ec2.internal\" (UID: \"858e9dbb3f829b7de0afbb8a36ca323c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.621010 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.620911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/858e9dbb3f829b7de0afbb8a36ca323c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-63.ec2.internal\" (UID: \"858e9dbb3f829b7de0afbb8a36ca323c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.621010 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.620941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0b7732360440c87ff89761ec4d043808-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal\" (UID: \"0b7732360440c87ff89761ec4d043808\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" Apr 23 17:58:36.621010 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.620944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b7732360440c87ff89761ec4d043808-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal\" (UID: \"0b7732360440c87ff89761ec4d043808\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal"
Apr 23 17:58:36.690110 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.690089 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:36.781719 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.781690 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal"
Apr 23 17:58:36.786980 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:36.786964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal"
Apr 23 17:58:36.790862 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.790833 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:36.891437 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.891373 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:36.991846 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:36.991819 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:37.092379 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:37.092347 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:37.121668 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.121636 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:58:37.122278 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.121779 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:58:37.122278 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.121799 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:58:37.193102 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:37.193073 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:37.209236 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.209206 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:53:36 +0000 UTC" deadline="2027-12-06 08:22:54.703017306 +0000 UTC"
Apr 23 17:58:37.209236 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.209232 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14198h24m17.493788298s"
Apr 23 17:58:37.217522 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.217497 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:58:37.225499 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.225480 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:58:37.250081 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.250057 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w2t8m"
Apr 23 17:58:37.258267 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.258246 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w2t8m"
Apr 23 17:58:37.265477 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.265455 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:37.294066 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:37.294044 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-63.ec2.internal\" not found"
Apr 23 17:58:37.320117 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.320099 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:37.325978 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:37.325954 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b7732360440c87ff89761ec4d043808.slice/crio-bb5e08f4c4a254f99db1eadf8b09d818a0f253180ceafc2480c394de3fbcfdf3 WatchSource:0}: Error finding container bb5e08f4c4a254f99db1eadf8b09d818a0f253180ceafc2480c394de3fbcfdf3: Status 404 returned error can't find the container with id bb5e08f4c4a254f99db1eadf8b09d818a0f253180ceafc2480c394de3fbcfdf3
Apr 23 17:58:37.330102 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.330085 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:58:37.355373 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.355330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" event={"ID":"0b7732360440c87ff89761ec4d043808","Type":"ContainerStarted","Data":"bb5e08f4c4a254f99db1eadf8b09d818a0f253180ceafc2480c394de3fbcfdf3"}
Apr 23 17:58:37.418253 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.418200 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal"
Apr 23 17:58:37.428027 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.428002 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:58:37.428905 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.428894 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal"
Apr 23 17:58:37.438091 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:37.438076 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:58:37.493862 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:37.493699 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858e9dbb3f829b7de0afbb8a36ca323c.slice/crio-ea94625b7f60ef8b015b547be4dacb958d2b3e6a241c7b8814f2f373fe76bc70 WatchSource:0}: Error finding container ea94625b7f60ef8b015b547be4dacb958d2b3e6a241c7b8814f2f373fe76bc70: Status 404 returned error can't find the container with id ea94625b7f60ef8b015b547be4dacb958d2b3e6a241c7b8814f2f373fe76bc70
Apr 23 17:58:38.132448 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.132416 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:38.194261 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.194200 2578 apiserver.go:52] "Watching apiserver"
Apr 23 17:58:38.208757 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.208554 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 17:58:38.211212 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.210818 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst","openshift-cluster-node-tuning-operator/tuned-5gwc4","openshift-dns/node-resolver-hl5qq","openshift-image-registry/node-ca-l4wm6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal","openshift-multus/multus-additional-cni-plugins-8d8vh","openshift-multus/multus-rtg5v","openshift-network-operator/iptables-alerter-tprc2","kube-system/konnectivity-agent-qcp2q","kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal","openshift-multus/network-metrics-daemon-xwp2q","openshift-network-diagnostics/network-check-target-gtd8z","openshift-ovn-kubernetes/ovnkube-node-978cv"]
Apr 23 17:58:38.213792 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.213758 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.215124 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hl5qq"
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.216180 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.216207 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.216329 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.216482 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.216656 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mqqwn\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.217039 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.217322 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.217399 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5gfcv\""
Apr 23 17:58:38.217700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.217455 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.219222 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.218347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l4wm6"
Apr 23 17:58:38.219222 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.219117 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-przhk\""
Apr 23 17:58:38.219222 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.219207 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.219222 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.219218 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.220327 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.219652 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.220327 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.220130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 17:58:38.220327 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.220310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.220587 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.220440 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wpjkn\""
Apr 23 17:58:38.220648 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.220610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.221021 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.220999 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tprc2"
Apr 23 17:58:38.221339 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.221318 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 17:58:38.221620 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.221602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2wb4q\""
Apr 23 17:58:38.221694 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.221671 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 17:58:38.222635 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.222611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.222739 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.222694 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:58:38.223049 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.223030 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zljhc\""
Apr 23 17:58:38.223189 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.223089 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.223261 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.223093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.224390 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.224157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qcp2q"
Apr 23 17:58:38.224390 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.224169 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.224791 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.224774 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 17:58:38.225040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.225023 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mm2md\""
Apr 23 17:58:38.225129 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.225060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.225421 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.225403 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q"
Apr 23 17:58:38.225504 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.225479 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c"
Apr 23 17:58:38.226194 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.226175 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j66km\""
Apr 23 17:58:38.226275 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.226224 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 17:58:38.226444 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.226428 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 17:58:38.226865 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.226837 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:58:38.226937 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.226876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.226937 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.226912 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df"
Apr 23 17:58:38.228492 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-daemon-config\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.228599 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysctl-d\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.228599 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-systemd\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.228599 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-run\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.228762 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-sys\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.228762 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-hosts-file\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq"
Apr 23 17:58:38.228762 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl9j8\" (UniqueName: \"kubernetes.io/projected/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-kube-api-access-gl9j8\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq"
Apr 23 17:58:38.228762 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228728 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 17:58:38.228762 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-cni-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysctl-conf\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2j56\" (UniqueName: \"kubernetes.io/projected/fbe124fd-664e-4080-a965-56b14926b56f-kube-api-access-z2j56\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lzd\" (UniqueName: \"kubernetes.io/projected/c04a156a-dd80-4859-a932-b0e25e9bce6b-kube-api-access-68lzd\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228847 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-socket-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-registration-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-etc-selinux\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtws\" (UniqueName: \"kubernetes.io/projected/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-kube-api-access-shtws\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-cni-multus\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.228978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-modprobe-d\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.228986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-os-release\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2mp\" (UniqueName: \"kubernetes.io/projected/9d9582b8-817a-4d02-862f-e5bbde6a1652-kube-api-access-hz2mp\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229036 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229120 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysconfig\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-cnibin\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-os-release\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229245 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-cni-binary-copy\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229301 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-system-cni-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-cni-bin\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-kubelet\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229414 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-75hp2\""
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-conf-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.229465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-etc-kubernetes\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-host\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-serviceca\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klz59\" (UniqueName: \"kubernetes.io/projected/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-kube-api-access-klz59\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-host\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbe124fd-664e-4080-a965-56b14926b56f-etc-tuned\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-system-cni-dir\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-sys-fs\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-cnibin\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d9582b8-817a-4d02-862f-e5bbde6a1652-cni-binary-copy\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-netns\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-hostroot\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-multus-certs\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-var-lib-kubelet\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.230172 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbe124fd-664e-4080-a965-56b14926b56f-tmp\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-socket-dir-parent\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.229989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-k8s-cni-cncf-io\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.230019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-kubernetes\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.230049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-lib-modules\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.230094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-tmp-dir\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.230132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.231146 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.230157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-device-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.259045 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.259013 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:53:37
+0000 UTC" deadline="2028-02-09 11:15:18.01274552 +0000 UTC" Apr 23 17:58:38.259154 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.259045 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15761h16m39.753703475s" Apr 23 17:58:38.279937 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.279917 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:38.319698 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.319677 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:58:38.330521 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-netns\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330638 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-multus-certs\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330638 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/94dfcde8-e6d8-4b6b-825e-40bb5305f5ef-konnectivity-ca\") pod \"konnectivity-agent-qcp2q\" (UID: \"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef\") " pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:58:38.330638 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330594 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-k8s-cni-cncf-io\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330638 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-netns\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-multus-certs\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-tmp-dir\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-run-k8s-cni-cncf-io\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sfrp4\" (UniqueName: \"kubernetes.io/projected/c5673cab-427f-416d-a4ba-94ac7c29dc9c-kube-api-access-sfrp4\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-systemd-units\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-cni-bin\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-daemon-config\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.330828 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.330829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-run\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.331182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331011 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-tmp-dir\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq" Apr 23 17:58:38.331182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-run\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.331182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-sys\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.331182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-etc-selinux\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shtws\" (UniqueName: \"kubernetes.io/projected/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-kube-api-access-shtws\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331214 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-log-socket\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-cni-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-etc-selinux\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-socket-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92df26fb-43f5-4d39-9c51-669235fa190e-host-slash\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2" Apr 23 17:58:38.331364 ip-10-0-143-63 
kubenswrapper[2578]: I0423 17:58:38.331332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-sys\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.331364 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-cni-netd\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-cni-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331395 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-daemon-config\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-cni-multus\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331459 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-cni-multus\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-node-log\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2mp\" (UniqueName: \"kubernetes.io/projected/9d9582b8-817a-4d02-862f-e5bbde6a1652-kube-api-access-hz2mp\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysconfig\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: 
I0423 17:58:38.331653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-socket-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-cnibin\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.331744 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331708 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-ovn\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysconfig\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-cnibin\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.332159 
ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-system-cni-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331891 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-cni-bin\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-kubelet\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-system-cni-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-conf-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331941 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-cni-bin\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-host-var-lib-kubelet\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.331988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-conf-dir\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-etc-kubernetes\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-host\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-klz59\" (UniqueName: \"kubernetes.io/projected/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-kube-api-access-klz59\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-etc-kubernetes\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-host\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-host\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6" Apr 23 17:58:38.332159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/fbe124fd-664e-4080-a965-56b14926b56f-etc-tuned\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ftk9\" (UniqueName: \"kubernetes.io/projected/92df26fb-43f5-4d39-9c51-669235fa190e-kube-api-access-9ftk9\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d9582b8-817a-4d02-862f-e5bbde6a1652-cni-binary-copy\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332244 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-hostroot\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-var-lib-kubelet\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbe124fd-664e-4080-a965-56b14926b56f-tmp\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-env-overrides\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-hostroot\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332363 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/246e705b-d502-4b42-bea0-4b6149b86183-ovn-node-metrics-cert\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-var-lib-kubelet\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-host\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-socket-dir-parent\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-kubernetes\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332426 2578 swap_util.go:74] 
"error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-lib-modules\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332489 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-kubernetes\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.332931 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-device-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-slash\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-etc-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysctl-d\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-device-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-systemd\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332690 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-hosts-file\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9j8\" (UniqueName: \"kubernetes.io/projected/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-kube-api-access-gl9j8\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-lib-modules\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-registration-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysctl-d\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332767 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-kubelet\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332805 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-run-netns\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d9582b8-817a-4d02-862f-e5bbde6a1652-cni-binary-copy\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-registration-dir\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.333697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysctl-conf\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.333697 ip-10-0-143-63 
kubenswrapper[2578]: I0423 17:58:38.332863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2j56\" (UniqueName: \"kubernetes.io/projected/fbe124fd-664e-4080-a965-56b14926b56f-kube-api-access-z2j56\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-hosts-file\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-systemd\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68lzd\" (UniqueName: \"kubernetes.io/projected/c04a156a-dd80-4859-a932-b0e25e9bce6b-kube-api-access-68lzd\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " 
pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-multus-socket-dir-parent\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.332981 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-sysctl-conf\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/94dfcde8-e6d8-4b6b-825e-40bb5305f5ef-agent-certs\") pod \"konnectivity-agent-qcp2q\" (UID: \"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef\") " pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") 
pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-var-lib-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkwxr\" (UniqueName: \"kubernetes.io/projected/246e705b-d502-4b42-bea0-4b6149b86183-kube-api-access-dkwxr\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-modprobe-d\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-systemd\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-ovnkube-script-lib\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-os-release\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.334389 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fbe124fd-664e-4080-a965-56b14926b56f-etc-modprobe-d\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-os-release\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 
17:58:38.333417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-cni-binary-copy\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-os-release\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92df26fb-43f5-4d39-9c51-669235fa190e-iptables-alerter-script\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-os-release\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-serviceca\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6" Apr 23 17:58:38.335040 
ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-run-ovn-kubernetes\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-ovnkube-config\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-system-cni-dir\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-sys-fs\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-cnibin\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333901 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d9582b8-817a-4d02-862f-e5bbde6a1652-cnibin\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-serviceca\") pod \"node-ca-l4wm6\" (UID: 
\"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6" Apr 23 17:58:38.335040 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-sys-fs\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.335755 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c04a156a-dd80-4859-a932-b0e25e9bce6b-cni-binary-copy\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335755 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.333980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c04a156a-dd80-4859-a932-b0e25e9bce6b-system-cni-dir\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.335955 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.335933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbe124fd-664e-4080-a965-56b14926b56f-etc-tuned\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.336018 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.335974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbe124fd-664e-4080-a965-56b14926b56f-tmp\") pod \"tuned-5gwc4\" (UID: 
\"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.341992 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.341703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtws\" (UniqueName: \"kubernetes.io/projected/55c8d4e4-74a0-45e8-9e7e-a49c8861570c-kube-api-access-shtws\") pod \"aws-ebs-csi-driver-node-hxqst\" (UID: \"55c8d4e4-74a0-45e8-9e7e-a49c8861570c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" Apr 23 17:58:38.341992 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.341895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2j56\" (UniqueName: \"kubernetes.io/projected/fbe124fd-664e-4080-a965-56b14926b56f-kube-api-access-z2j56\") pod \"tuned-5gwc4\" (UID: \"fbe124fd-664e-4080-a965-56b14926b56f\") " pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" Apr 23 17:58:38.341992 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.341933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lzd\" (UniqueName: \"kubernetes.io/projected/c04a156a-dd80-4859-a932-b0e25e9bce6b-kube-api-access-68lzd\") pod \"multus-additional-cni-plugins-8d8vh\" (UID: \"c04a156a-dd80-4859-a932-b0e25e9bce6b\") " pod="openshift-multus/multus-additional-cni-plugins-8d8vh" Apr 23 17:58:38.342660 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.342640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klz59\" (UniqueName: \"kubernetes.io/projected/22a39804-db9b-4a6b-a927-b5f0bb1d22eb-kube-api-access-klz59\") pod \"node-ca-l4wm6\" (UID: \"22a39804-db9b-4a6b-a927-b5f0bb1d22eb\") " pod="openshift-image-registry/node-ca-l4wm6" Apr 23 17:58:38.342944 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.342923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl9j8\" (UniqueName: 
\"kubernetes.io/projected/d8d9c074-5a2a-4898-b910-f1a16ffc62fc-kube-api-access-gl9j8\") pod \"node-resolver-hl5qq\" (UID: \"d8d9c074-5a2a-4898-b910-f1a16ffc62fc\") " pod="openshift-dns/node-resolver-hl5qq" Apr 23 17:58:38.343252 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.343237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2mp\" (UniqueName: \"kubernetes.io/projected/9d9582b8-817a-4d02-862f-e5bbde6a1652-kube-api-access-hz2mp\") pod \"multus-rtg5v\" (UID: \"9d9582b8-817a-4d02-862f-e5bbde6a1652\") " pod="openshift-multus/multus-rtg5v" Apr 23 17:58:38.358153 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.358125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" event={"ID":"858e9dbb3f829b7de0afbb8a36ca323c","Type":"ContainerStarted","Data":"ea94625b7f60ef8b015b547be4dacb958d2b3e6a241c7b8814f2f373fe76bc70"} Apr 23 17:58:38.434182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/94dfcde8-e6d8-4b6b-825e-40bb5305f5ef-konnectivity-ca\") pod \"konnectivity-agent-qcp2q\" (UID: \"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef\") " pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:58:38.434182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfrp4\" (UniqueName: \"kubernetes.io/projected/c5673cab-427f-416d-a4ba-94ac7c29dc9c-kube-api-access-sfrp4\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:38.434182 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-systemd-units\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-cni-bin\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-log-socket\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92df26fb-43f5-4d39-9c51-669235fa190e-host-slash\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92df26fb-43f5-4d39-9c51-669235fa190e-host-slash\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-cni-bin\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-systemd-units\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-log-socket\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-cni-netd\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-node-log\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-cni-netd\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434425 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-node-log\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.434448 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-ovn\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.434523 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:58:38.934503291 +0000 UTC m=+3.168245190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-ovn\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ftk9\" (UniqueName: \"kubernetes.io/projected/92df26fb-43f5-4d39-9c51-669235fa190e-kube-api-access-9ftk9\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-env-overrides\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/246e705b-d502-4b42-bea0-4b6149b86183-ovn-node-metrics-cert\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434654 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/94dfcde8-e6d8-4b6b-825e-40bb5305f5ef-konnectivity-ca\") pod \"konnectivity-agent-qcp2q\" (UID: \"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef\") " pod="kube-system/konnectivity-agent-qcp2q"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-slash\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-etc-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-kubelet\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-slash\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-run-netns\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-etc-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/94dfcde8-e6d8-4b6b-825e-40bb5305f5ef-agent-certs\") pod \"konnectivity-agent-qcp2q\" (UID: \"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef\") " pod="kube-system/konnectivity-agent-qcp2q"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:58:38.434929 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434914 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-kubelet\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-var-lib-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkwxr\" (UniqueName: \"kubernetes.io/projected/246e705b-d502-4b42-bea0-4b6149b86183-kube-api-access-dkwxr\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434959 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-run-netns\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.434981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-systemd\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-var-lib-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-ovnkube-script-lib\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92df26fb-43f5-4d39-9c51-669235fa190e-iptables-alerter-script\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-run-ovn-kubernetes\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-ovnkube-config\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-env-overrides\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-openvswitch\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-host-run-ovn-kubernetes\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/246e705b-d502-4b42-bea0-4b6149b86183-run-systemd\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.435692 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-ovnkube-config\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.436436 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.435818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92df26fb-43f5-4d39-9c51-669235fa190e-iptables-alerter-script\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2"
Apr 23 17:58:38.436436 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.436303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/246e705b-d502-4b42-bea0-4b6149b86183-ovnkube-script-lib\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.437088 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.437066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/246e705b-d502-4b42-bea0-4b6149b86183-ovn-node-metrics-cert\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.437447 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.437428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/94dfcde8-e6d8-4b6b-825e-40bb5305f5ef-agent-certs\") pod \"konnectivity-agent-qcp2q\" (UID: \"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef\") " pod="kube-system/konnectivity-agent-qcp2q"
Apr 23 17:58:38.441433 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.441410 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:38.441433 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.441433 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:38.441606 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.441445 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:38.441606 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.441514 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. No retries permitted until 2026-04-23 17:58:38.941498081 +0000 UTC m=+3.175239993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:38.443041 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.443016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfrp4\" (UniqueName: \"kubernetes.io/projected/c5673cab-427f-416d-a4ba-94ac7c29dc9c-kube-api-access-sfrp4\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q"
Apr 23 17:58:38.443808 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.443786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ftk9\" (UniqueName: \"kubernetes.io/projected/92df26fb-43f5-4d39-9c51-669235fa190e-kube-api-access-9ftk9\") pod \"iptables-alerter-tprc2\" (UID: \"92df26fb-43f5-4d39-9c51-669235fa190e\") " pod="openshift-network-operator/iptables-alerter-tprc2"
Apr 23 17:58:38.444210 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.444191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkwxr\" (UniqueName: \"kubernetes.io/projected/246e705b-d502-4b42-bea0-4b6149b86183-kube-api-access-dkwxr\") pod \"ovnkube-node-978cv\" (UID: \"246e705b-d502-4b42-bea0-4b6149b86183\") " pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.534577 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.534545 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rtg5v"
Apr 23 17:58:38.541292 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.541266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hl5qq"
Apr 23 17:58:38.550920 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.550902 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5gwc4"
Apr 23 17:58:38.556465 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.556446 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l4wm6"
Apr 23 17:58:38.562036 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.562010 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8d8vh"
Apr 23 17:58:38.569579 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.569558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tprc2"
Apr 23 17:58:38.576181 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.576161 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst"
Apr 23 17:58:38.583728 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.583708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qcp2q"
Apr 23 17:58:38.588321 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.588303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:58:38.939189 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:38.939155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q"
Apr 23 17:58:38.939355 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.939313 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:38.939418 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:38.939389 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:58:39.939369152 +0000 UTC m=+4.173111066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:38.962787 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.962754 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8d9c074_5a2a_4898_b910_f1a16ffc62fc.slice/crio-d34d896cad57c20fe56cc1511d8d87e54e37603698946bb2dc490af246f7d3d0 WatchSource:0}: Error finding container d34d896cad57c20fe56cc1511d8d87e54e37603698946bb2dc490af246f7d3d0: Status 404 returned error can't find the container with id d34d896cad57c20fe56cc1511d8d87e54e37603698946bb2dc490af246f7d3d0
Apr 23 17:58:38.964097 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.964039 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod246e705b_d502_4b42_bea0_4b6149b86183.slice/crio-2978438abf7fbb1f95bbf3641a147af5927332c74c2aa10f82b683d25814bb4a WatchSource:0}: Error finding container 2978438abf7fbb1f95bbf3641a147af5927332c74c2aa10f82b683d25814bb4a: Status 404 returned error can't find the container with id 2978438abf7fbb1f95bbf3641a147af5927332c74c2aa10f82b683d25814bb4a
Apr 23 17:58:38.965742 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.965624 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c8d4e4_74a0_45e8_9e7e_a49c8861570c.slice/crio-3e776d187ff48ceda6074aa5cb7c4720e8c5fb9f1c4d11588be5da444899bf4a WatchSource:0}: Error finding container 3e776d187ff48ceda6074aa5cb7c4720e8c5fb9f1c4d11588be5da444899bf4a: Status 404 returned error can't find the container with id 3e776d187ff48ceda6074aa5cb7c4720e8c5fb9f1c4d11588be5da444899bf4a
Apr 23 17:58:38.968227 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.968191 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe124fd_664e_4080_a965_56b14926b56f.slice/crio-a47027063bf3f0ca8a891aecc8e832f2cd6f7026eefa83b2465c0e358c7f5d45 WatchSource:0}: Error finding container a47027063bf3f0ca8a891aecc8e832f2cd6f7026eefa83b2465c0e358c7f5d45: Status 404 returned error can't find the container with id a47027063bf3f0ca8a891aecc8e832f2cd6f7026eefa83b2465c0e358c7f5d45
Apr 23 17:58:38.970047 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.970012 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94dfcde8_e6d8_4b6b_825e_40bb5305f5ef.slice/crio-0621d9a06f8818725023af5715b0c8216d714d49dcc710ab515e8141f89c6420 WatchSource:0}: Error finding container 0621d9a06f8818725023af5715b0c8216d714d49dcc710ab515e8141f89c6420: Status 404 returned error can't find the container with id 0621d9a06f8818725023af5715b0c8216d714d49dcc710ab515e8141f89c6420
Apr 23 17:58:38.971398 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.971374 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92df26fb_43f5_4d39_9c51_669235fa190e.slice/crio-aa44a2d2bcd2ccf0f55779459d97e7381008495f17ebaedb1b014ba1926d104b WatchSource:0}: Error finding container aa44a2d2bcd2ccf0f55779459d97e7381008495f17ebaedb1b014ba1926d104b: Status 404 returned error can't find the container with id aa44a2d2bcd2ccf0f55779459d97e7381008495f17ebaedb1b014ba1926d104b
Apr 23 17:58:38.972341 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:58:38.972316 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04a156a_dd80_4859_a932_b0e25e9bce6b.slice/crio-652d9d318f8fa5171940d051472c5d33e850bc8831244ce0182b1ce6b146f4e0 WatchSource:0}: Error finding container 652d9d318f8fa5171940d051472c5d33e850bc8831244ce0182b1ce6b146f4e0: Status 404 returned error can't find the container with id 652d9d318f8fa5171940d051472c5d33e850bc8831244ce0182b1ce6b146f4e0
Apr 23 17:58:39.039676 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.039651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:58:39.039794 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.039777 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:39.039830 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.039798 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:39.039830 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.039808 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:39.039890 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.039850 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. No retries permitted until 2026-04-23 17:58:40.03983623 +0000 UTC m=+4.273578137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:39.259944 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.259737 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:53:37 +0000 UTC" deadline="2027-12-09 09:03:32.368345818 +0000 UTC"
Apr 23 17:58:39.259944 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.259775 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14271h4m53.108574104s"
Apr 23 17:58:39.315058 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.314293 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wm4q6"]
Apr 23 17:58:39.316917 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.316397 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.316917 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.316471 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc"
Apr 23 17:58:39.341126 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.341088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.341252 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.341163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c32531ab-73c0-4407-990e-7be86cd675cc-dbus\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.341252 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.341197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c32531ab-73c0-4407-990e-7be86cd675cc-kubelet-config\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.353558 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.353488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:58:39.353684 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.353616 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df"
Apr 23 17:58:39.365952 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.365877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l4wm6" event={"ID":"22a39804-db9b-4a6b-a927-b5f0bb1d22eb","Type":"ContainerStarted","Data":"c6bd03e64ea9deaf7e68c04d732b0dbc2e015719a96237e78d34b4fc5d7419ea"}
Apr 23 17:58:39.367930 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.367905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerStarted","Data":"652d9d318f8fa5171940d051472c5d33e850bc8831244ce0182b1ce6b146f4e0"}
Apr 23 17:58:39.369253 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.369232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" event={"ID":"55c8d4e4-74a0-45e8-9e7e-a49c8861570c","Type":"ContainerStarted","Data":"3e776d187ff48ceda6074aa5cb7c4720e8c5fb9f1c4d11588be5da444899bf4a"}
Apr 23 17:58:39.370367 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.370344 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hl5qq" event={"ID":"d8d9c074-5a2a-4898-b910-f1a16ffc62fc","Type":"ContainerStarted","Data":"d34d896cad57c20fe56cc1511d8d87e54e37603698946bb2dc490af246f7d3d0"}
Apr 23 17:58:39.375325 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.375208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtg5v" event={"ID":"9d9582b8-817a-4d02-862f-e5bbde6a1652","Type":"ContainerStarted","Data":"dd71b38cbf3f14a559c157efc52e6a85b2814b328fa090f686aab5bb08e376cf"}
Apr 23 17:58:39.377889 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.377868 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tprc2" event={"ID":"92df26fb-43f5-4d39-9c51-669235fa190e","Type":"ContainerStarted","Data":"aa44a2d2bcd2ccf0f55779459d97e7381008495f17ebaedb1b014ba1926d104b"}
Apr 23 17:58:39.379847 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.379732 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qcp2q" event={"ID":"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef","Type":"ContainerStarted","Data":"0621d9a06f8818725023af5715b0c8216d714d49dcc710ab515e8141f89c6420"}
Apr 23 17:58:39.382071 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.382045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" event={"ID":"fbe124fd-664e-4080-a965-56b14926b56f","Type":"ContainerStarted","Data":"a47027063bf3f0ca8a891aecc8e832f2cd6f7026eefa83b2465c0e358c7f5d45"}
Apr 23 17:58:39.383851 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.383730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"2978438abf7fbb1f95bbf3641a147af5927332c74c2aa10f82b683d25814bb4a"}
Apr 23 17:58:39.388167 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.388145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" event={"ID":"858e9dbb3f829b7de0afbb8a36ca323c","Type":"ContainerStarted","Data":"6a188c3bdfc909cfda5af90ddb0aa045698f3071b3351e90d5e52b9d69ab0cf4"}
Apr 23 17:58:39.441746 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.441714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.441880 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.441782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c32531ab-73c0-4407-990e-7be86cd675cc-dbus\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.441880 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.441814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c32531ab-73c0-4407-990e-7be86cd675cc-kubelet-config\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.442003 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.441923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c32531ab-73c0-4407-990e-7be86cd675cc-kubelet-config\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.442063 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.442044 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:39.442119 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.442102 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret podName:c32531ab-73c0-4407-990e-7be86cd675cc nodeName:}" failed. No retries permitted until 2026-04-23 17:58:39.94208375 +0000 UTC m=+4.175825660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret") pod "global-pull-secret-syncer-wm4q6" (UID: "c32531ab-73c0-4407-990e-7be86cd675cc") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:39.442430 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.442409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c32531ab-73c0-4407-990e-7be86cd675cc-dbus\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.951572 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.945598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:58:39.951572 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:39.945661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q"
Apr 23 17:58:39.951572 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.949240 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:39.951572 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.949309 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret podName:c32531ab-73c0-4407-990e-7be86cd675cc nodeName:}" failed. No retries permitted until 2026-04-23 17:58:40.949290226 +0000 UTC m=+5.183032119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret") pod "global-pull-secret-syncer-wm4q6" (UID: "c32531ab-73c0-4407-990e-7be86cd675cc") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:39.951572 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.949745 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:39.951572 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:39.949813 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:58:41.949795123 +0000 UTC m=+6.183537022 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:40.047512 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:40.046848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:40.047512 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.047051 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:40.047512 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.047070 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:40.047512 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.047083 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:40.047512 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.047140 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. 
No retries permitted until 2026-04-23 17:58:42.047122802 +0000 UTC m=+6.280864701 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:40.363818 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:40.363745 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:40.364233 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.363877 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:40.410898 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:40.410861 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b7732360440c87ff89761ec4d043808" containerID="aae7286723e6b03191569db42d5a98495d2ab72884902a3084e40ee7b7aea8c9" exitCode=0 Apr 23 17:58:40.411757 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:40.411733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" event={"ID":"0b7732360440c87ff89761ec4d043808","Type":"ContainerDied","Data":"aae7286723e6b03191569db42d5a98495d2ab72884902a3084e40ee7b7aea8c9"} Apr 23 17:58:40.429572 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:40.428722 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-63.ec2.internal" podStartSLOduration=3.428705525 podStartE2EDuration="3.428705525s" podCreationTimestamp="2026-04-23 17:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:58:39.40339421 +0000 UTC m=+3.637136126" watchObservedRunningTime="2026-04-23 17:58:40.428705525 +0000 UTC m=+4.662447439" Apr 23 17:58:40.955437 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:40.955307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:40.955601 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.955545 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 
17:58:40.955669 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:40.955610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret podName:c32531ab-73c0-4407-990e-7be86cd675cc nodeName:}" failed. No retries permitted until 2026-04-23 17:58:42.955590976 +0000 UTC m=+7.189332875 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret") pod "global-pull-secret-syncer-wm4q6" (UID: "c32531ab-73c0-4407-990e-7be86cd675cc") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:41.353069 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:41.352995 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:41.353217 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:41.353119 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:41.353559 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:41.353514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:41.353646 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:41.353616 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:41.425230 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:41.425195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" event={"ID":"0b7732360440c87ff89761ec4d043808","Type":"ContainerStarted","Data":"3c58c171edaf75b729da44db3ef47560db694621ad941eb3e5abec0a7a8ca29f"} Apr 23 17:58:41.964414 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:41.964374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:41.964614 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:41.964509 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:41.964614 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:41.964585 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:58:45.964565086 +0000 UTC m=+10.198306996 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:42.065582 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:42.065504 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:42.065764 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.065672 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:42.065764 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.065695 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:42.065764 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.065707 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:42.065764 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.065763 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. 
No retries permitted until 2026-04-23 17:58:46.065742655 +0000 UTC m=+10.299484552 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:42.354164 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:42.354086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:42.354322 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.354232 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:42.973626 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:42.973586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:42.974050 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.973741 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:42.974050 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:42.973802 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret podName:c32531ab-73c0-4407-990e-7be86cd675cc nodeName:}" failed. No retries permitted until 2026-04-23 17:58:46.973784238 +0000 UTC m=+11.207526157 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret") pod "global-pull-secret-syncer-wm4q6" (UID: "c32531ab-73c0-4407-990e-7be86cd675cc") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:43.353301 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:43.353203 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:43.353301 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:43.353215 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:43.353566 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:43.353352 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:43.353566 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:43.353482 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:44.353616 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:44.353585 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:44.354049 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:44.353732 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:45.353700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:45.353655 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:45.354161 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:45.353786 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:45.354161 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:45.353663 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:45.354161 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:45.353885 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:45.996871 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:45.996792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:45.997053 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:45.996924 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:45.997053 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:45.997006 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:58:53.996985092 +0000 UTC m=+18.230727004 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:46.097371 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:46.097332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:46.097556 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:46.097492 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:46.097556 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:46.097510 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:46.097556 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:46.097523 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:46.097714 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:46.097592 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. 
No retries permitted until 2026-04-23 17:58:54.097573887 +0000 UTC m=+18.331315794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:46.354381 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:46.354269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:46.354842 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:46.354430 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:47.004472 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:47.004427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:47.004672 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:47.004625 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:47.004734 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:47.004694 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret podName:c32531ab-73c0-4407-990e-7be86cd675cc nodeName:}" failed. No retries permitted until 2026-04-23 17:58:55.00467495 +0000 UTC m=+19.238416846 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret") pod "global-pull-secret-syncer-wm4q6" (UID: "c32531ab-73c0-4407-990e-7be86cd675cc") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:47.353637 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:47.353505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:47.353789 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:47.353659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:47.353789 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:47.353685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:47.353789 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:47.353770 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:48.353130 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:48.353096 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:48.353602 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:48.353232 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:49.353483 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:49.353447 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:49.353921 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:49.353448 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:49.353921 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:49.353589 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:49.353921 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:49.353659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:50.353591 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:50.353553 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:50.354025 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:50.353678 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:51.353930 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:51.353891 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:51.354371 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:51.353903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:51.354371 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:51.354015 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:51.354371 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:51.354120 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:52.353743 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:52.353712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:52.353907 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:52.353842 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:53.353642 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:53.353603 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:53.354142 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:53.353604 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:53.354142 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:53.353739 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:53.354142 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:53.353831 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:54.053867 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:54.053824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:54.054050 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.053985 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:54.054118 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.054063 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:59:10.054045449 +0000 UTC m=+34.287787353 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:54.155223 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:54.155187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:54.155427 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.155405 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:54.155510 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.155434 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:54.155510 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.155448 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:54.155643 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.155514 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. 
No retries permitted until 2026-04-23 17:59:10.155495745 +0000 UTC m=+34.389237641 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:54.353857 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:54.353773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:54.354245 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:54.353908 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:55.061159 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:55.061124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:55.061353 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:55.061280 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:55.061422 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:55.061360 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret podName:c32531ab-73c0-4407-990e-7be86cd675cc nodeName:}" failed. No retries permitted until 2026-04-23 17:59:11.061339127 +0000 UTC m=+35.295081024 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret") pod "global-pull-secret-syncer-wm4q6" (UID: "c32531ab-73c0-4407-990e-7be86cd675cc") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:55.353184 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:55.353092 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:55.353327 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:55.353204 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:55.353327 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:55.353265 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:55.353431 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:55.353376 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:56.356255 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.356048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:56.356950 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:56.356352 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:56.449798 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.449731 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 17:58:56.450004 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.449981 2578 generic.go:358] "Generic (PLEG): container finished" podID="246e705b-d502-4b42-bea0-4b6149b86183" containerID="a6b793f7441f6ae67ced00e690c0b53b9763335e08051d4afa84341a19372b44" exitCode=1 Apr 23 17:58:56.450078 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.450054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"0e858c14082a797d14e9a861c5b62c6c9063dcb458ab00564019aa2ce096d729"} Apr 23 17:58:56.450143 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.450094 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"30b6a7d2c7d3755e0269196f02b959be0b54bbf8dd869ad0fc63cc6a012c3615"} Apr 23 17:58:56.450143 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.450109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"e9a769fb2839bf8138f7cbc76975298e55e045c549a7685adcea05ccf14c8026"} Apr 23 17:58:56.450143 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.450121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"aa1220242d159cd63641640008d0d38ce83c01fb3c1a25003cbc9e300d0e3dd9"} Apr 23 17:58:56.450143 
ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.450134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerDied","Data":"a6b793f7441f6ae67ced00e690c0b53b9763335e08051d4afa84341a19372b44"} Apr 23 17:58:56.450287 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.450149 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"1910f46008015ea77ff8690692eef07ecb7652590c7396f7e85545f332b1d767"} Apr 23 17:58:56.451187 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.451164 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l4wm6" event={"ID":"22a39804-db9b-4a6b-a927-b5f0bb1d22eb","Type":"ContainerStarted","Data":"7f8458ae4996f53998494a2fa28d0b3b22f19dbba1f5699b10703429b28cc4a2"} Apr 23 17:58:56.452752 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.452320 2578 generic.go:358] "Generic (PLEG): container finished" podID="c04a156a-dd80-4859-a932-b0e25e9bce6b" containerID="46e124990a48c5fd57da6ce06f474dd40b955ffdfbf6232ff0a004da6fb88c80" exitCode=0 Apr 23 17:58:56.452752 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.452375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerDied","Data":"46e124990a48c5fd57da6ce06f474dd40b955ffdfbf6232ff0a004da6fb88c80"} Apr 23 17:58:56.453604 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.453578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" event={"ID":"55c8d4e4-74a0-45e8-9e7e-a49c8861570c","Type":"ContainerStarted","Data":"0da70fe4bdeca6783746e8038ab82226bad413b76583f8f1d8bb31e59f831dc6"} Apr 23 17:58:56.454987 ip-10-0-143-63 
kubenswrapper[2578]: I0423 17:58:56.454812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hl5qq" event={"ID":"d8d9c074-5a2a-4898-b910-f1a16ffc62fc","Type":"ContainerStarted","Data":"01b369e62f97681453a7b5bf46e02378b3ace164685fd18c1d8bb14fb9c18189"} Apr 23 17:58:56.456122 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.456102 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtg5v" event={"ID":"9d9582b8-817a-4d02-862f-e5bbde6a1652","Type":"ContainerStarted","Data":"c7ec1cc0cbeb68acb2a2b6f7b7b1c85226a9d200a8205800aa02512106f521c6"} Apr 23 17:58:56.457296 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.457278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qcp2q" event={"ID":"94dfcde8-e6d8-4b6b-825e-40bb5305f5ef","Type":"ContainerStarted","Data":"43f3c8ef2beea5eab40e15696fb6edfb868b73c2ebd0c837762df2cd96e6d827"} Apr 23 17:58:56.458442 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.458424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" event={"ID":"fbe124fd-664e-4080-a965-56b14926b56f","Type":"ContainerStarted","Data":"7c2b6ab0beda52caea733a4a3f81f5f547797585d5ce34bc8eda30848e684f3b"} Apr 23 17:58:56.464710 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.464674 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l4wm6" podStartSLOduration=11.808224979 podStartE2EDuration="20.464661182s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.976191848 +0000 UTC m=+3.209933755" lastFinishedPulling="2026-04-23 17:58:47.632628047 +0000 UTC m=+11.866369958" observedRunningTime="2026-04-23 17:58:56.464303517 +0000 UTC m=+20.698045432" watchObservedRunningTime="2026-04-23 17:58:56.464661182 +0000 UTC m=+20.698403095" Apr 23 17:58:56.464806 ip-10-0-143-63 kubenswrapper[2578]: I0423 
17:58:56.464751 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-63.ec2.internal" podStartSLOduration=19.464734881 podStartE2EDuration="19.464734881s" podCreationTimestamp="2026-04-23 17:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:58:41.441044007 +0000 UTC m=+5.674785922" watchObservedRunningTime="2026-04-23 17:58:56.464734881 +0000 UTC m=+20.698476796" Apr 23 17:58:56.477207 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.477155 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hl5qq" podStartSLOduration=4.025065145 podStartE2EDuration="20.477139429s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.964562715 +0000 UTC m=+3.198304611" lastFinishedPulling="2026-04-23 17:58:55.41663699 +0000 UTC m=+19.650378895" observedRunningTime="2026-04-23 17:58:56.477056233 +0000 UTC m=+20.710798180" watchObservedRunningTime="2026-04-23 17:58:56.477139429 +0000 UTC m=+20.710881345" Apr 23 17:58:56.507844 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.507798 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5gwc4" podStartSLOduration=4.025637895 podStartE2EDuration="20.507779647s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.970970889 +0000 UTC m=+3.204712790" lastFinishedPulling="2026-04-23 17:58:55.453112637 +0000 UTC m=+19.686854542" observedRunningTime="2026-04-23 17:58:56.491842778 +0000 UTC m=+20.725584694" watchObservedRunningTime="2026-04-23 17:58:56.507779647 +0000 UTC m=+20.741521564" Apr 23 17:58:56.508234 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.508186 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-rtg5v" podStartSLOduration=3.7376189699999998 podStartE2EDuration="20.508157631s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.975491461 +0000 UTC m=+3.209233356" lastFinishedPulling="2026-04-23 17:58:55.746030104 +0000 UTC m=+19.979772017" observedRunningTime="2026-04-23 17:58:56.507528318 +0000 UTC m=+20.741270234" watchObservedRunningTime="2026-04-23 17:58:56.508157631 +0000 UTC m=+20.741899546" Apr 23 17:58:56.543299 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.542720 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qcp2q" podStartSLOduration=4.098498297 podStartE2EDuration="20.542705026s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.97242856 +0000 UTC m=+3.206170453" lastFinishedPulling="2026-04-23 17:58:55.416635275 +0000 UTC m=+19.650377182" observedRunningTime="2026-04-23 17:58:56.542462405 +0000 UTC m=+20.776204313" watchObservedRunningTime="2026-04-23 17:58:56.542705026 +0000 UTC m=+20.776446943" Apr 23 17:58:56.575347 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:56.575320 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:58:57.274999 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.274825 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:58:56.575342179Z","UUID":"447550d1-11c1-46de-8b62-0019e34d78c6","Handler":null,"Name":"","Endpoint":""} Apr 23 17:58:57.278511 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.278479 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 
17:58:57.278511 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.278515 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:58:57.353148 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.353121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:57.353275 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.353121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:57.353275 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:57.353251 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:58:57.353391 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:57.353338 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:57.463384 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.463347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tprc2" event={"ID":"92df26fb-43f5-4d39-9c51-669235fa190e","Type":"ContainerStarted","Data":"0d598dbb212389d157d4dd988e587a95e636b28a0ba91c404cd08aedfc0566dc"} Apr 23 17:58:57.466015 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.465977 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" event={"ID":"55c8d4e4-74a0-45e8-9e7e-a49c8861570c","Type":"ContainerStarted","Data":"49104a6036fc4d383c001e9fa151d2f1c4f745c13e3b5c7cb7cb70388f291952"} Apr 23 17:58:57.479666 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:57.479626 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tprc2" podStartSLOduration=5.037169709 podStartE2EDuration="21.479608992s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.974220725 +0000 UTC m=+3.207962626" lastFinishedPulling="2026-04-23 17:58:55.41666001 +0000 UTC m=+19.650401909" observedRunningTime="2026-04-23 17:58:57.479032437 +0000 UTC m=+21.712774346" watchObservedRunningTime="2026-04-23 17:58:57.479608992 +0000 UTC m=+21.713350906" Apr 23 17:58:58.353490 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:58.353410 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:58:58.353682 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:58.353561 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:58:58.470405 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:58.470376 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 17:58:58.470986 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:58.470760 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"d57ae77e05e261a9019c04e58b55881f5205bbd4606643ddb3d2938a63f5a03d"} Apr 23 17:58:58.472611 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:58.472578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" event={"ID":"55c8d4e4-74a0-45e8-9e7e-a49c8861570c","Type":"ContainerStarted","Data":"767cf534a85f3d64c62029b6dede333297f4fbd16b8284df170400c89f6b2880"} Apr 23 17:58:59.353304 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:59.353265 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:58:59.353489 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:58:59.353265 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:58:59.353489 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:59.353375 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:58:59.353489 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:58:59.353477 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:59:00.104272 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.104240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:59:00.105163 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.105140 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:59:00.122904 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.122855 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hxqst" podStartSLOduration=5.736858561 podStartE2EDuration="24.122828631s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.967623297 +0000 UTC m=+3.201365192" lastFinishedPulling="2026-04-23 17:58:57.35359336 +0000 UTC m=+21.587335262" observedRunningTime="2026-04-23 17:58:58.489549658 +0000 UTC 
m=+22.723291575" watchObservedRunningTime="2026-04-23 17:59:00.122828631 +0000 UTC m=+24.356570547" Apr 23 17:59:00.353413 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.353247 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:00.353607 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:00.353488 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:59:00.478732 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.478711 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 17:59:00.479089 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"45cb517e02ed9b0e2fa315aeaa864b98900315177f225a7a624b89058f757e76"} Apr 23 17:59:00.479875 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479383 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:59:00.479875 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479426 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:59:00.479875 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479439 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 
23 17:59:00.479875 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479450 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:59:00.479875 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479520 2578 scope.go:117] "RemoveContainer" containerID="a6b793f7441f6ae67ced00e690c0b53b9763335e08051d4afa84341a19372b44" Apr 23 17:59:00.479875 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.479846 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qcp2q" Apr 23 17:59:00.496866 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.496837 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:59:00.497305 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:00.497288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" Apr 23 17:59:01.353246 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.353215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:59:01.353246 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.353249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:01.353892 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:01.353324 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:59:01.353892 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:01.353359 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:59:01.483596 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.483506 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 17:59:01.483899 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.483870 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" event={"ID":"246e705b-d502-4b42-bea0-4b6149b86183","Type":"ContainerStarted","Data":"57dc7c3d576ec581aee962b5b33ec9b6082348340eff355711c21d08a06dd935"} Apr 23 17:59:01.485489 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.485465 2578 generic.go:358] "Generic (PLEG): container finished" podID="c04a156a-dd80-4859-a932-b0e25e9bce6b" containerID="ce7477f63de00fa6483d7562515b148f58b7a41b5d5558da48a1541750032afc" exitCode=0 Apr 23 17:59:01.485608 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.485554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerDied","Data":"ce7477f63de00fa6483d7562515b148f58b7a41b5d5558da48a1541750032afc"} Apr 23 17:59:01.513721 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:01.513672 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-978cv" 
podStartSLOduration=8.968118188 podStartE2EDuration="25.513653139s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.966023787 +0000 UTC m=+3.199765693" lastFinishedPulling="2026-04-23 17:58:55.511558733 +0000 UTC m=+19.745300644" observedRunningTime="2026-04-23 17:59:01.510918987 +0000 UTC m=+25.744660903" watchObservedRunningTime="2026-04-23 17:59:01.513653139 +0000 UTC m=+25.747395056" Apr 23 17:59:02.266881 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.266609 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gtd8z"] Apr 23 17:59:02.267029 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.266944 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:02.267101 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:02.267054 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:59:02.267855 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.267825 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wm4q6"] Apr 23 17:59:02.267969 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.267918 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:59:02.268050 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:02.268024 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:59:02.279648 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.279626 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xwp2q"] Apr 23 17:59:02.279755 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.279718 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:02.279820 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:02.279786 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:59:02.492554 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.492451 2578 generic.go:358] "Generic (PLEG): container finished" podID="c04a156a-dd80-4859-a932-b0e25e9bce6b" containerID="4cc5aec4bd48d8e816d7e7b31a032b3a2bcf17cb866ccd825825bc89b47a9676" exitCode=0 Apr 23 17:59:02.493218 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:02.492551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerDied","Data":"4cc5aec4bd48d8e816d7e7b31a032b3a2bcf17cb866ccd825825bc89b47a9676"} Apr 23 17:59:03.498337 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:03.498252 2578 generic.go:358] "Generic (PLEG): container finished" podID="c04a156a-dd80-4859-a932-b0e25e9bce6b" containerID="707af0936b6891aabbc185d0d09c04bfd72aa4d680fbaf2f330579053e602b14" exitCode=0 Apr 23 17:59:03.498337 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:03.498316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerDied","Data":"707af0936b6891aabbc185d0d09c04bfd72aa4d680fbaf2f330579053e602b14"} Apr 23 17:59:04.353472 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:04.353438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:04.353472 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:04.353456 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:04.353697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:04.353439 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:59:04.353697 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:04.353580 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:59:04.353697 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:04.353644 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:59:04.353851 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:04.353734 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:59:06.355330 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:06.355020 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:06.355330 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:06.355086 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:06.355827 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:06.355339 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:59:06.355827 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:06.355417 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:59:06.355827 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:06.355105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:59:06.355827 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:06.355512 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:59:08.353860 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.353818 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:08.353860 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.353865 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:59:08.354351 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.353831 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:08.354351 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.353953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 17:59:08.354351 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.354019 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gtd8z" podUID="979ab58c-b655-4aab-94f9-8920472712df" Apr 23 17:59:08.354351 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.354111 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wm4q6" podUID="c32531ab-73c0-4407-990e-7be86cd675cc" Apr 23 17:59:08.589733 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.589651 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-63.ec2.internal" event="NodeReady" Apr 23 17:59:08.589889 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.589804 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:59:08.676511 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.676477 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bgn7x"] Apr 23 17:59:08.711623 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.711521 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-86jl7"] Apr 23 17:59:08.711790 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.711685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.714663 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.714640 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:59:08.714873 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.714847 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sv5q9\"" Apr 23 17:59:08.719399 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.719375 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:59:08.725523 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.725498 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bgn7x"] Apr 23 17:59:08.725523 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.725526 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-86jl7"] Apr 23 
17:59:08.725694 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.725647 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:08.727750 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.727728 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:59:08.727750 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.727747 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:59:08.727854 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.727832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tsbqj\"" Apr 23 17:59:08.727961 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.727941 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:59:08.857943 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.857879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5xjn\" (UniqueName: \"kubernetes.io/projected/ff148188-17a2-4b88-a857-ae14164f4a06-kube-api-access-k5xjn\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:08.857943 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.857920 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsr7q\" (UniqueName: \"kubernetes.io/projected/5240d464-6fd9-4f8a-819f-0385f4314995-kube-api-access-nsr7q\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.858093 ip-10-0-143-63 kubenswrapper[2578]: I0423 
17:59:08.857970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5240d464-6fd9-4f8a-819f-0385f4314995-tmp-dir\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.858093 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.857999 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:08.858093 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.858015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.858093 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.858081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5240d464-6fd9-4f8a-819f-0385f4314995-config-volume\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.959169 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5xjn\" (UniqueName: \"kubernetes.io/projected/ff148188-17a2-4b88-a857-ae14164f4a06-kube-api-access-k5xjn\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:08.959169 ip-10-0-143-63 
kubenswrapper[2578]: I0423 17:59:08.959173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsr7q\" (UniqueName: \"kubernetes.io/projected/5240d464-6fd9-4f8a-819f-0385f4314995-kube-api-access-nsr7q\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.959336 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5240d464-6fd9-4f8a-819f-0385f4314995-tmp-dir\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.959336 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:08.959336 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.959336 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5240d464-6fd9-4f8a-819f-0385f4314995-config-volume\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.959506 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.959339 2578 secret.go:189] Couldn't 
get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:59:08.959506 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.959392 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:59:08.959506 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.959402 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:09.459385464 +0000 UTC m=+33.693127360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found Apr 23 17:59:08.959506 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:08.959459 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:09.45944086 +0000 UTC m=+33.693182759 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found Apr 23 17:59:08.959671 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959621 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5240d464-6fd9-4f8a-819f-0385f4314995-tmp-dir\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.959848 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.959833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5240d464-6fd9-4f8a-819f-0385f4314995-config-volume\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.971104 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.971080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsr7q\" (UniqueName: \"kubernetes.io/projected/5240d464-6fd9-4f8a-819f-0385f4314995-kube-api-access-nsr7q\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:08.972927 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:08.972912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5xjn\" (UniqueName: \"kubernetes.io/projected/ff148188-17a2-4b88-a857-ae14164f4a06-kube-api-access-k5xjn\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:09.463368 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:09.463327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:09.463800 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:09.463377 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:59:09.463800 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:09.463394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:09.463800 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:09.463448 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:10.463427922 +0000 UTC m=+34.697169830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found Apr 23 17:59:09.463800 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:09.463484 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:59:09.463800 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:09.463528 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:59:10.463518012 +0000 UTC m=+34.697259905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found Apr 23 17:59:09.512460 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:09.512427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerStarted","Data":"7fd891d32cbb0294c53f25d560b10f889f95299b414d9c9a5adf88b5b1f1c2ba"} Apr 23 17:59:10.066624 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.066594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:10.066786 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.066700 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:59:10.066786 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.066752 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 17:59:42.066737591 +0000 UTC m=+66.300479484 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:59:10.167759 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.167730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:10.167873 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.167860 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:59:10.167913 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.167876 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:59:10.167913 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.167886 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6wn2h for pod openshift-network-diagnostics/network-check-target-gtd8z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:59:10.167983 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.167931 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h podName:979ab58c-b655-4aab-94f9-8920472712df nodeName:}" failed. 
No retries permitted until 2026-04-23 17:59:42.167916779 +0000 UTC m=+66.401658677 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6wn2h" (UniqueName: "kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h") pod "network-check-target-gtd8z" (UID: "979ab58c-b655-4aab-94f9-8920472712df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:59:10.353340 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.353273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6" Apr 23 17:59:10.353466 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.353274 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 17:59:10.353466 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.353274 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z" Apr 23 17:59:10.363776 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.363757 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:59:10.363893 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.363806 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 17:59:10.364479 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.364448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:59:10.364479 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.364475 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:59:10.364654 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.364563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qjtm5\"" Apr 23 17:59:10.365046 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.365021 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7tv5t\"" Apr 23 17:59:10.470056 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.470029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 17:59:10.470411 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.470063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 17:59:10.470411 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.470170 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:59:10.470411 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.470177 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:59:10.470411 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.470226 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.470210792 +0000 UTC m=+36.703952685 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found Apr 23 17:59:10.470411 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:10.470239 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.470233021 +0000 UTC m=+36.703974914 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found
Apr 23 17:59:10.515997 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.515969 2578 generic.go:358] "Generic (PLEG): container finished" podID="c04a156a-dd80-4859-a932-b0e25e9bce6b" containerID="7fd891d32cbb0294c53f25d560b10f889f95299b414d9c9a5adf88b5b1f1c2ba" exitCode=0
Apr 23 17:59:10.516119 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:10.516008 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerDied","Data":"7fd891d32cbb0294c53f25d560b10f889f95299b414d9c9a5adf88b5b1f1c2ba"}
Apr 23 17:59:11.074665 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.074582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:59:11.077759 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.077731 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c32531ab-73c0-4407-990e-7be86cd675cc-original-pull-secret\") pod \"global-pull-secret-syncer-wm4q6\" (UID: \"c32531ab-73c0-4407-990e-7be86cd675cc\") " pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:59:11.263375 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.263210 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wm4q6"
Apr 23 17:59:11.403610 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.403578 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wm4q6"]
Apr 23 17:59:11.408225 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:59:11.408194 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32531ab_73c0_4407_990e_7be86cd675cc.slice/crio-39b38880ca8978ca75d7ef25dde33dd2aac937213f41d7c45998c3d5511a4b46 WatchSource:0}: Error finding container 39b38880ca8978ca75d7ef25dde33dd2aac937213f41d7c45998c3d5511a4b46: Status 404 returned error can't find the container with id 39b38880ca8978ca75d7ef25dde33dd2aac937213f41d7c45998c3d5511a4b46
Apr 23 17:59:11.520724 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.520686 2578 generic.go:358] "Generic (PLEG): container finished" podID="c04a156a-dd80-4859-a932-b0e25e9bce6b" containerID="61f022219f09c3415089b5f6767d38e924e9049cccfd80b4490fde3e5371fca6" exitCode=0
Apr 23 17:59:11.521374 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.520753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerDied","Data":"61f022219f09c3415089b5f6767d38e924e9049cccfd80b4490fde3e5371fca6"}
Apr 23 17:59:11.521837 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:11.521818 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wm4q6" event={"ID":"c32531ab-73c0-4407-990e-7be86cd675cc","Type":"ContainerStarted","Data":"39b38880ca8978ca75d7ef25dde33dd2aac937213f41d7c45998c3d5511a4b46"}
Apr 23 17:59:12.487198 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:12.486967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7"
Apr 23 17:59:12.487473 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:12.487229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x"
Apr 23 17:59:12.487473 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:12.487133 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:12.487473 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:12.487352 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:16.487329044 +0000 UTC m=+40.721070943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found
Apr 23 17:59:12.487473 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:12.487406 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:12.487473 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:12.487456 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:16.487444401 +0000 UTC m=+40.721186307 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:12.528065 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:12.527988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" event={"ID":"c04a156a-dd80-4859-a932-b0e25e9bce6b","Type":"ContainerStarted","Data":"6b8797d972d115cd8f8e154c67575941e6ed5b7555f93dbd2aafaed40b394ae1"}
Apr 23 17:59:12.556961 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:12.556863 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8d8vh" podStartSLOduration=6.291891408 podStartE2EDuration="36.556841223s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:58:38.975548421 +0000 UTC m=+3.209290314" lastFinishedPulling="2026-04-23 17:59:09.240498224 +0000 UTC m=+33.474240129" observedRunningTime="2026-04-23 17:59:12.555284737 +0000 UTC m=+36.789026655" watchObservedRunningTime="2026-04-23 17:59:12.556841223 +0000 UTC m=+36.790583136"
Apr 23 17:59:15.534759 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:15.534677 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wm4q6" event={"ID":"c32531ab-73c0-4407-990e-7be86cd675cc","Type":"ContainerStarted","Data":"6f111a2baf10fc3711475ceb8d611ff4bce7ac794d0268089c29790bd8765ea9"}
Apr 23 17:59:15.552084 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:15.551976 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wm4q6" podStartSLOduration=32.749447048 podStartE2EDuration="36.551961115s" podCreationTimestamp="2026-04-23 17:58:39 +0000 UTC" firstStartedPulling="2026-04-23 17:59:11.409678057 +0000 UTC m=+35.643419951" lastFinishedPulling="2026-04-23 17:59:15.212192121 +0000 UTC m=+39.445934018" observedRunningTime="2026-04-23 17:59:15.55156171 +0000 UTC m=+39.785303623" watchObservedRunningTime="2026-04-23 17:59:15.551961115 +0000 UTC m=+39.785703030"
Apr 23 17:59:16.522226 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:16.522186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7"
Apr 23 17:59:16.522226 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:16.522229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x"
Apr 23 17:59:16.522436 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:16.522341 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:16.522436 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:16.522341 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:16.522436 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:16.522390 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:24.522378052 +0000 UTC m=+48.756119945 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:16.522436 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:16.522402 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:24.522396437 +0000 UTC m=+48.756138330 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found
Apr 23 17:59:24.577111 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:24.577062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7"
Apr 23 17:59:24.577111 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:24.577107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x"
Apr 23 17:59:24.577502 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:24.577209 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:24.577502 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:24.577214 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:24.577502 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:24.577258 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:40.577244793 +0000 UTC m=+64.810986686 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:24.577502 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:24.577271 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:40.577265919 +0000 UTC m=+64.811007812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found
Apr 23 17:59:32.535857 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:32.535821 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-978cv"
Apr 23 17:59:33.060049 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.060017 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"]
Apr 23 17:59:33.088881 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.088841 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"]
Apr 23 17:59:33.089054 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.088997 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.091081 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.091060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 17:59:33.091206 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.091061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 17:59:33.091206 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.091060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 17:59:33.091206 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.091061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 17:59:33.239104 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.239068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9795312-09b0-4528-8cad-f3cbc488baab-tmp\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.239283 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.239146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d9795312-09b0-4528-8cad-f3cbc488baab-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.239283 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.239176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs6pw\" (UniqueName: \"kubernetes.io/projected/d9795312-09b0-4528-8cad-f3cbc488baab-kube-api-access-xs6pw\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.339993 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.339904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d9795312-09b0-4528-8cad-f3cbc488baab-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.339993 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.339949 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs6pw\" (UniqueName: \"kubernetes.io/projected/d9795312-09b0-4528-8cad-f3cbc488baab-kube-api-access-xs6pw\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.340163 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.339997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9795312-09b0-4528-8cad-f3cbc488baab-tmp\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.340441 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.340422 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9795312-09b0-4528-8cad-f3cbc488baab-tmp\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.343512 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.343478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d9795312-09b0-4528-8cad-f3cbc488baab-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.348706 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.348684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs6pw\" (UniqueName: \"kubernetes.io/projected/d9795312-09b0-4528-8cad-f3cbc488baab-kube-api-access-xs6pw\") pod \"klusterlet-addon-workmgr-5d887f9654-zvxxn\" (UID: \"d9795312-09b0-4528-8cad-f3cbc488baab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.397751 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.397722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:33.525596 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.525565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"]
Apr 23 17:59:33.569325 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:33.569294 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" event={"ID":"d9795312-09b0-4528-8cad-f3cbc488baab","Type":"ContainerStarted","Data":"23556a4ed1907164a8b935bc28024ea7b8f87fac2ba49cf9e3c914d4754f4d87"}
Apr 23 17:59:37.579245 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:37.579207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" event={"ID":"d9795312-09b0-4528-8cad-f3cbc488baab","Type":"ContainerStarted","Data":"00ded0364468d315b74d395b8ff1d53ea6f17e34b2f6ded2fd26dd368b201a2e"}
Apr 23 17:59:37.579700 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:37.579366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:37.580933 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:37.580914 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn"
Apr 23 17:59:37.596715 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:37.596671 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" podStartSLOduration=0.979599252 podStartE2EDuration="4.596660139s" podCreationTimestamp="2026-04-23 17:59:33 +0000 UTC" firstStartedPulling="2026-04-23 17:59:33.531813548 +0000 UTC m=+57.765555445" lastFinishedPulling="2026-04-23 17:59:37.148874436 +0000 UTC m=+61.382616332" observedRunningTime="2026-04-23 17:59:37.596053335 +0000 UTC m=+61.829795272" watchObservedRunningTime="2026-04-23 17:59:37.596660139 +0000 UTC m=+61.830402094"
Apr 23 17:59:40.593694 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:40.593655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7"
Apr 23 17:59:40.593694 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:40.593693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x"
Apr 23 17:59:40.594406 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:40.593787 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:40.594406 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:40.593789 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:40.594406 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:40.593845 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:12.59383151 +0000 UTC m=+96.827573404 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:40.594406 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:40.593858 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:12.59385206 +0000 UTC m=+96.827593953 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found
Apr 23 17:59:42.103297 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.103260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q"
Apr 23 17:59:42.105572 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.105551 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:59:42.114312 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:42.114298 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 17:59:42.114376 ip-10-0-143-63 kubenswrapper[2578]: E0423 17:59:42.114366 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 18:00:46.114349308 +0000 UTC m=+130.348091201 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : secret "metrics-daemon-secret" not found
Apr 23 17:59:42.204369 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.204335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:59:42.206975 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.206955 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:59:42.216697 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.216682 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:59:42.227370 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.227349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wn2h\" (UniqueName: \"kubernetes.io/projected/979ab58c-b655-4aab-94f9-8920472712df-kube-api-access-6wn2h\") pod \"network-check-target-gtd8z\" (UID: \"979ab58c-b655-4aab-94f9-8920472712df\") " pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:59:42.478261 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.478182 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qjtm5\""
Apr 23 17:59:42.486703 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.486681 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:59:42.597263 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:42.597235 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gtd8z"]
Apr 23 17:59:42.601048 ip-10-0-143-63 kubenswrapper[2578]: W0423 17:59:42.601020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod979ab58c_b655_4aab_94f9_8920472712df.slice/crio-33f8dfca4b911f975f3f7af6afc4feb95baf770dcd8780192365fce0d71ec761 WatchSource:0}: Error finding container 33f8dfca4b911f975f3f7af6afc4feb95baf770dcd8780192365fce0d71ec761: Status 404 returned error can't find the container with id 33f8dfca4b911f975f3f7af6afc4feb95baf770dcd8780192365fce0d71ec761
Apr 23 17:59:43.591316 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:43.591285 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gtd8z" event={"ID":"979ab58c-b655-4aab-94f9-8920472712df","Type":"ContainerStarted","Data":"33f8dfca4b911f975f3f7af6afc4feb95baf770dcd8780192365fce0d71ec761"}
Apr 23 17:59:45.596978 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:45.596947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gtd8z" event={"ID":"979ab58c-b655-4aab-94f9-8920472712df","Type":"ContainerStarted","Data":"1b217c35745d997671fae4c3ef1320ee657cbd73ba77cfd1b3ae14ce9266ebe5"}
Apr 23 17:59:45.597331 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:45.597060 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 17:59:45.612419 ip-10-0-143-63 kubenswrapper[2578]: I0423 17:59:45.612376 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gtd8z" podStartSLOduration=67.030959135 podStartE2EDuration="1m9.612363895s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:59:42.603165896 +0000 UTC m=+66.836907789" lastFinishedPulling="2026-04-23 17:59:45.184570654 +0000 UTC m=+69.418312549" observedRunningTime="2026-04-23 17:59:45.611507044 +0000 UTC m=+69.845248971" watchObservedRunningTime="2026-04-23 17:59:45.612363895 +0000 UTC m=+69.846105809"
Apr 23 18:00:12.623313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:12.623269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7"
Apr 23 18:00:12.623313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:12.623311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x"
Apr 23 18:00:12.623841 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:12.623408 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 18:00:12.623841 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:12.623422 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 18:00:12.623841 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:12.623460 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls podName:5240d464-6fd9-4f8a-819f-0385f4314995 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:16.623445443 +0000 UTC m=+160.857187337 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls") pod "dns-default-bgn7x" (UID: "5240d464-6fd9-4f8a-819f-0385f4314995") : secret "dns-default-metrics-tls" not found
Apr 23 18:00:12.623841 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:12.623500 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert podName:ff148188-17a2-4b88-a857-ae14164f4a06 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:16.623482724 +0000 UTC m=+160.857224619 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert") pod "ingress-canary-86jl7" (UID: "ff148188-17a2-4b88-a857-ae14164f4a06") : secret "canary-serving-cert" not found
Apr 23 18:00:16.601374 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:16.601341 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gtd8z"
Apr 23 18:00:36.624882 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.624847 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m"]
Apr 23 18:00:36.626877 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.626859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m"
Apr 23 18:00:36.629367 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.629344 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 18:00:36.629803 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.629786 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 18:00:36.629862 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.629823 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bzd86\""
Apr 23 18:00:36.635083 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.635058 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m"]
Apr 23 18:00:36.695307 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.695273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64cxf\" (UniqueName: \"kubernetes.io/projected/f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4-kube-api-access-64cxf\") pod \"volume-data-source-validator-7c6cbb6c87-5jh6m\" (UID: \"f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m"
Apr 23 18:00:36.733933 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.733877 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-d447l"]
Apr 23 18:00:36.735727 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.735710 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-776bb79f5c-2r8bg"]
Apr 23 18:00:36.735869 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.735852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-d447l"
Apr 23 18:00:36.737490 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.737470 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg"
Apr 23 18:00:36.737921 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.737901 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 18:00:36.737998 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.737902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 18:00:36.738206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.738188 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 18:00:36.738729 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.738713 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 18:00:36.738799 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.738720 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9ksm2\""
Apr 23 18:00:36.739502 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.739488 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 18:00:36.739798 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.739781 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 18:00:36.739963 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.739948 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxfcc\""
Apr 23 18:00:36.740026 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.739968 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 18:00:36.744873 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.744847 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 18:00:36.746369 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.746342 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 18:00:36.746833 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.746810 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-d447l"]
Apr 23 18:00:36.747872 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.747852 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-776bb79f5c-2r8bg"]
Apr 23 18:00:36.795780 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.795745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-config\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l"
Apr 23 18:00:36.795780 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.795786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-serving-cert\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l"
Apr 23 18:00:36.796072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.795820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64cxf\" (UniqueName: \"kubernetes.io/projected/f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4-kube-api-access-64cxf\") pod \"volume-data-source-validator-7c6cbb6c87-5jh6m\" (UID: \"f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m"
Apr 23 18:00:36.796072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.795839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sp5q\" (UniqueName: \"kubernetes.io/projected/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-kube-api-access-5sp5q\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l"
Apr 23 18:00:36.796072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.795924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-trusted-ca\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l"
Apr 23 18:00:36.796072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.795974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-image-registry-private-configuration\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg"
Apr 23 18:00:36.796072 ip-10-0-143-63
kubenswrapper[2578]: I0423 18:00:36.796002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.796072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.796033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-registry-certificates\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.796072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.796060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-trusted-ca\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.796332 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.796108 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d5c381f1-454c-4290-9c44-d067a94c399b-ca-trust-extracted\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.796332 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.796147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-bound-sa-token\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.796332 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.796207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-installation-pull-secrets\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.796332 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.796224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzvn\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-kube-api-access-nvzvn\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.803445 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.803412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64cxf\" (UniqueName: \"kubernetes.io/projected/f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4-kube-api-access-64cxf\") pod \"volume-data-source-validator-7c6cbb6c87-5jh6m\" (UID: \"f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m" Apr 23 18:00:36.837715 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.837680 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84"] Apr 23 18:00:36.840619 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.840597 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:36.841478 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.841454 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-64c9b47658-qqqmr"] Apr 23 18:00:36.843196 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.843169 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g"] Apr 23 18:00:36.843323 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.843305 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:36.844203 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.844185 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 18:00:36.844299 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.844220 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 18:00:36.844509 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.844443 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 18:00:36.844509 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.844491 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 18:00:36.845125 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845109 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" Apr 23 18:00:36.845480 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845457 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-j95wt\"" Apr 23 18:00:36.845758 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845743 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 18:00:36.845802 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845755 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 18:00:36.845802 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845744 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 18:00:36.845882 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-27ct2\"" Apr 23 18:00:36.845882 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.845758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 18:00:36.846092 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.846046 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 18:00:36.846999 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.846973 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-njrxp\"" Apr 23 18:00:36.848006 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.847989 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 18:00:36.850318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.850294 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84"] Apr 23 18:00:36.856867 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.856845 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-64c9b47658-qqqmr"] Apr 23 18:00:36.867460 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.867434 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g"] Apr 23 18:00:36.896985 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.896889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d5c381f1-454c-4290-9c44-d067a94c399b-ca-trust-extracted\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.896985 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.896925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-bound-sa-token\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.896985 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.896951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-installation-pull-secrets\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " 
pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.896985 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.896969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzvn\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-kube-api-access-nvzvn\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-config\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-serving-cert\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sp5q\" (UniqueName: \"kubernetes.io/projected/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-kube-api-access-5sp5q\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-trusted-ca\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-image-registry-private-configuration\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-registry-certificates\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897272 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-trusted-ca\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 
18:00:36.897328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d5c381f1-454c-4290-9c44-d067a94c399b-ca-trust-extracted\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.897846 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:36.897395 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 18:00:36.897846 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:36.897409 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776bb79f5c-2r8bg: secret "image-registry-tls" not found Apr 23 18:00:36.897846 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:36.897463 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls podName:d5c381f1-454c-4290-9c44-d067a94c399b nodeName:}" failed. No retries permitted until 2026-04-23 18:00:37.39744278 +0000 UTC m=+121.631184690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls") pod "image-registry-776bb79f5c-2r8bg" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b") : secret "image-registry-tls" not found Apr 23 18:00:36.898004 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.897910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-config\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.898243 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.898216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-trusted-ca\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.898317 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.898260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-registry-certificates\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.898317 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.898224 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-trusted-ca\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.899710 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:00:36.899689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-serving-cert\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.899815 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.899762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-installation-pull-secrets\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.900067 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.900052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-image-registry-private-configuration\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.905494 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.905460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-bound-sa-token\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.905729 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.905707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzvn\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-kube-api-access-nvzvn\") pod 
\"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:36.905915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.905897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sp5q\" (UniqueName: \"kubernetes.io/projected/49f66b8f-5bbb-4d67-9dfd-cd24fb73773b-kube-api-access-5sp5q\") pod \"console-operator-9d4b6777b-d447l\" (UID: \"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b\") " pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:36.936580 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.936524 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m" Apr 23 18:00:36.997843 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.997719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:36.997843 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.997763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-stats-auth\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:36.997843 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.997804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0af64194-8451-4345-9044-583d24fa444c-serving-cert\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:36.998147 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.997856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:36.998147 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.997888 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af64194-8451-4345-9044-583d24fa444c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:36.998147 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.998006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-default-certificate\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:36.998147 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.998052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtws\" (UniqueName: \"kubernetes.io/projected/c0426005-0ecf-4d42-aaec-e90027db197e-kube-api-access-bvtws\") pod \"network-check-source-8894fc9bd-xfl9g\" (UID: 
\"c0426005-0ecf-4d42-aaec-e90027db197e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" Apr 23 18:00:36.998573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.998372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7kh\" (UniqueName: \"kubernetes.io/projected/778d33bd-ade8-4471-a0d0-10670f14a624-kube-api-access-wc7kh\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:36.998573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:36.998473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cr4c\" (UniqueName: \"kubernetes.io/projected/0af64194-8451-4345-9044-583d24fa444c-kube-api-access-8cr4c\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.048662 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.048617 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:37.052738 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.052707 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m"] Apr 23 18:00:37.055868 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:00:37.055836 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49a4cd8_3fa6_403a_b269_d6e4ed51f7c4.slice/crio-16d1352755074ba9495d501bbb7cea8881db65f3fe9ae72e2c16c04b2db173f1 WatchSource:0}: Error finding container 16d1352755074ba9495d501bbb7cea8881db65f3fe9ae72e2c16c04b2db173f1: Status 404 returned error can't find the container with id 16d1352755074ba9495d501bbb7cea8881db65f3fe9ae72e2c16c04b2db173f1 Apr 23 18:00:37.099796 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.099719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af64194-8451-4345-9044-583d24fa444c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.100041 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.099983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-default-certificate\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.100389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtws\" (UniqueName: 
\"kubernetes.io/projected/c0426005-0ecf-4d42-aaec-e90027db197e-kube-api-access-bvtws\") pod \"network-check-source-8894fc9bd-xfl9g\" (UID: \"c0426005-0ecf-4d42-aaec-e90027db197e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" Apr 23 18:00:37.100389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7kh\" (UniqueName: \"kubernetes.io/projected/778d33bd-ade8-4471-a0d0-10670f14a624-kube-api-access-wc7kh\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.100389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cr4c\" (UniqueName: \"kubernetes.io/projected/0af64194-8451-4345-9044-583d24fa444c-kube-api-access-8cr4c\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.100389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.100389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-stats-auth\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " 
pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.100389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0af64194-8451-4345-9044-583d24fa444c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.100852 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.100852 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.100628 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 18:00:37.100852 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.100676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af64194-8451-4345-9044-583d24fa444c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.100852 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.100694 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. 
No retries permitted until 2026-04-23 18:00:37.600676136 +0000 UTC m=+121.834418044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : secret "router-metrics-certs-default" not found Apr 23 18:00:37.101079 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.100991 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:37.600972421 +0000 UTC m=+121.834714335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : configmap references non-existent config key: service-ca.crt Apr 23 18:00:37.103088 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.103031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0af64194-8451-4345-9044-583d24fa444c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.104798 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.103493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-stats-auth\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 
18:00:37.104798 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.103493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-default-certificate\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.109132 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.109101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cr4c\" (UniqueName: \"kubernetes.io/projected/0af64194-8451-4345-9044-583d24fa444c-kube-api-access-8cr4c\") pod \"kube-storage-version-migrator-operator-6769c5d45-qhb84\" (UID: \"0af64194-8451-4345-9044-583d24fa444c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.109659 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.109635 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7kh\" (UniqueName: \"kubernetes.io/projected/778d33bd-ade8-4471-a0d0-10670f14a624-kube-api-access-wc7kh\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.109810 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.109788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtws\" (UniqueName: \"kubernetes.io/projected/c0426005-0ecf-4d42-aaec-e90027db197e-kube-api-access-bvtws\") pod \"network-check-source-8894fc9bd-xfl9g\" (UID: \"c0426005-0ecf-4d42-aaec-e90027db197e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" Apr 23 18:00:37.153224 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.153144 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" Apr 23 18:00:37.165114 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.165083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" Apr 23 18:00:37.167972 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.167926 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-d447l"] Apr 23 18:00:37.172987 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:00:37.172962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f66b8f_5bbb_4d67_9dfd_cd24fb73773b.slice/crio-db0511b6b2aa83abc789d3d233a21bd1541e6ae0dc0df152bf93e19ee2a1208e WatchSource:0}: Error finding container db0511b6b2aa83abc789d3d233a21bd1541e6ae0dc0df152bf93e19ee2a1208e: Status 404 returned error can't find the container with id db0511b6b2aa83abc789d3d233a21bd1541e6ae0dc0df152bf93e19ee2a1208e Apr 23 18:00:37.288232 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.288201 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84"] Apr 23 18:00:37.292184 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:00:37.292137 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af64194_8451_4345_9044_583d24fa444c.slice/crio-c9d58ebd37b138c9da441e61769fc1b0d46d3ec53b78f36834d8746187693f9f WatchSource:0}: Error finding container c9d58ebd37b138c9da441e61769fc1b0d46d3ec53b78f36834d8746187693f9f: Status 404 returned error can't find the container with id c9d58ebd37b138c9da441e61769fc1b0d46d3ec53b78f36834d8746187693f9f Apr 23 18:00:37.305482 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:00:37.305445 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g"] Apr 23 18:00:37.308460 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:00:37.308427 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0426005_0ecf_4d42_aaec_e90027db197e.slice/crio-f481a553954c49a215a84fdcdacb483f71628af8145c75df804a9a42a9597268 WatchSource:0}: Error finding container f481a553954c49a215a84fdcdacb483f71628af8145c75df804a9a42a9597268: Status 404 returned error can't find the container with id f481a553954c49a215a84fdcdacb483f71628af8145c75df804a9a42a9597268 Apr 23 18:00:37.404194 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.404099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:37.404351 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.404246 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 18:00:37.404351 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.404268 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776bb79f5c-2r8bg: secret "image-registry-tls" not found Apr 23 18:00:37.404351 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.404324 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls podName:d5c381f1-454c-4290-9c44-d067a94c399b nodeName:}" failed. No retries permitted until 2026-04-23 18:00:38.404309465 +0000 UTC m=+122.638051363 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls") pod "image-registry-776bb79f5c-2r8bg" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b") : secret "image-registry-tls" not found Apr 23 18:00:37.605402 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.605357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.605579 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.605424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:37.605579 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.605556 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:38.605512886 +0000 UTC m=+122.839254797 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : configmap references non-existent config key: service-ca.crt Apr 23 18:00:37.605579 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.605555 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 18:00:37.605711 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:37.605614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:38.605602999 +0000 UTC m=+122.839344896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : secret "router-metrics-certs-default" not found Apr 23 18:00:37.696090 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.695993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" event={"ID":"0af64194-8451-4345-9044-583d24fa444c","Type":"ContainerStarted","Data":"c9d58ebd37b138c9da441e61769fc1b0d46d3ec53b78f36834d8746187693f9f"} Apr 23 18:00:37.697251 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.697219 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m" event={"ID":"f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4","Type":"ContainerStarted","Data":"16d1352755074ba9495d501bbb7cea8881db65f3fe9ae72e2c16c04b2db173f1"} Apr 23 18:00:37.698765 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.698737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" event={"ID":"c0426005-0ecf-4d42-aaec-e90027db197e","Type":"ContainerStarted","Data":"d420a9dd2b4e40cce54fd0b14caf511c1e2cf718b2b4c378722696cabd9f831e"} Apr 23 18:00:37.698883 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.698768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" event={"ID":"c0426005-0ecf-4d42-aaec-e90027db197e","Type":"ContainerStarted","Data":"f481a553954c49a215a84fdcdacb483f71628af8145c75df804a9a42a9597268"} Apr 23 18:00:37.699900 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.699866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" event={"ID":"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b","Type":"ContainerStarted","Data":"db0511b6b2aa83abc789d3d233a21bd1541e6ae0dc0df152bf93e19ee2a1208e"} Apr 23 18:00:37.714068 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:37.713978 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xfl9g" podStartSLOduration=1.7139582949999999 podStartE2EDuration="1.713958295s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:00:37.713148561 +0000 UTC m=+121.946890477" watchObservedRunningTime="2026-04-23 18:00:37.713958295 +0000 UTC m=+121.947700211" Apr 23 18:00:38.414054 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:38.413877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: 
\"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:38.414270 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:38.414067 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 18:00:38.414270 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:38.414092 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776bb79f5c-2r8bg: secret "image-registry-tls" not found Apr 23 18:00:38.414270 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:38.414167 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls podName:d5c381f1-454c-4290-9c44-d067a94c399b nodeName:}" failed. No retries permitted until 2026-04-23 18:00:40.414144575 +0000 UTC m=+124.647886486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls") pod "image-registry-776bb79f5c-2r8bg" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b") : secret "image-registry-tls" not found Apr 23 18:00:38.617296 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:38.616397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:38.617296 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:38.616585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: 
\"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:38.617296 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:38.616755 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:40.61673633 +0000 UTC m=+124.850478246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : configmap references non-existent config key: service-ca.crt Apr 23 18:00:38.617296 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:38.617193 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 18:00:38.617296 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:38.617250 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:40.617231829 +0000 UTC m=+124.850973737 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : secret "router-metrics-certs-default" not found Apr 23 18:00:40.434202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.434166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:40.434618 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:40.434293 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 18:00:40.434618 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:40.434307 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776bb79f5c-2r8bg: secret "image-registry-tls" not found Apr 23 18:00:40.434618 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:40.434361 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls podName:d5c381f1-454c-4290-9c44-d067a94c399b nodeName:}" failed. No retries permitted until 2026-04-23 18:00:44.434344649 +0000 UTC m=+128.668086543 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls") pod "image-registry-776bb79f5c-2r8bg" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b") : secret "image-registry-tls" not found Apr 23 18:00:40.636842 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.636800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:40.636994 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.636860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:40.636994 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:40.636964 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 18:00:40.637056 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:40.637005 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:44.636982809 +0000 UTC m=+128.870724729 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : configmap references non-existent config key: service-ca.crt Apr 23 18:00:40.637056 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:40.637040 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:44.637027145 +0000 UTC m=+128.870769041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : secret "router-metrics-certs-default" not found Apr 23 18:00:40.711891 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.711798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" event={"ID":"0af64194-8451-4345-9044-583d24fa444c","Type":"ContainerStarted","Data":"a2d1be072539cfeb1b99eff743b9bc64d4def1cb082df824489528256686d710"} Apr 23 18:00:40.713291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.713256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m" event={"ID":"f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4","Type":"ContainerStarted","Data":"937429065fc40c96be447b7839df7099175d2967e7697e55b2f053e2c8702936"} Apr 23 18:00:40.714711 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.714691 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/0.log" Apr 23 18:00:40.714841 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.714725 2578 generic.go:358] "Generic (PLEG): container finished" podID="49f66b8f-5bbb-4d67-9dfd-cd24fb73773b" containerID="d0ef67e6ac88644ea992e37b166b6c72fd7340983ed6eb568acd3a74fa014a74" exitCode=255 Apr 23 18:00:40.714841 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.714753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" event={"ID":"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b","Type":"ContainerDied","Data":"d0ef67e6ac88644ea992e37b166b6c72fd7340983ed6eb568acd3a74fa014a74"} Apr 23 18:00:40.714983 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.714968 2578 scope.go:117] "RemoveContainer" containerID="d0ef67e6ac88644ea992e37b166b6c72fd7340983ed6eb568acd3a74fa014a74" Apr 23 18:00:40.728376 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.728317 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" podStartSLOduration=2.324589362 podStartE2EDuration="4.728298512s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:00:37.294093988 +0000 UTC m=+121.527835881" lastFinishedPulling="2026-04-23 18:00:39.697803122 +0000 UTC m=+123.931545031" observedRunningTime="2026-04-23 18:00:40.727646049 +0000 UTC m=+124.961387965" watchObservedRunningTime="2026-04-23 18:00:40.728298512 +0000 UTC m=+124.962040431" Apr 23 18:00:40.742819 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:40.742767 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jh6m" podStartSLOduration=3.102834342 podStartE2EDuration="4.7427474s" 
podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:00:37.057694247 +0000 UTC m=+121.291436141" lastFinishedPulling="2026-04-23 18:00:38.697607302 +0000 UTC m=+122.931349199" observedRunningTime="2026-04-23 18:00:40.741391412 +0000 UTC m=+124.975133325" watchObservedRunningTime="2026-04-23 18:00:40.7427474 +0000 UTC m=+124.976489315" Apr 23 18:00:41.718872 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:41.718839 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:00:41.719305 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:41.719246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/0.log" Apr 23 18:00:41.719305 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:41.719280 2578 generic.go:358] "Generic (PLEG): container finished" podID="49f66b8f-5bbb-4d67-9dfd-cd24fb73773b" containerID="0056eedf0388f35b191d11941e3ae449dda42a8ff5d1ac503bf0b9756db9abe7" exitCode=255 Apr 23 18:00:41.719399 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:41.719374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" event={"ID":"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b","Type":"ContainerDied","Data":"0056eedf0388f35b191d11941e3ae449dda42a8ff5d1ac503bf0b9756db9abe7"} Apr 23 18:00:41.719432 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:41.719425 2578 scope.go:117] "RemoveContainer" containerID="d0ef67e6ac88644ea992e37b166b6c72fd7340983ed6eb568acd3a74fa014a74" Apr 23 18:00:41.719761 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:41.719735 2578 scope.go:117] "RemoveContainer" containerID="0056eedf0388f35b191d11941e3ae449dda42a8ff5d1ac503bf0b9756db9abe7" Apr 23 18:00:41.719944 ip-10-0-143-63 kubenswrapper[2578]: E0423 
18:00:41.719924 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-d447l_openshift-console-operator(49f66b8f-5bbb-4d67-9dfd-cd24fb73773b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" podUID="49f66b8f-5bbb-4d67-9dfd-cd24fb73773b" Apr 23 18:00:42.723192 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:42.723157 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:00:42.723823 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:42.723634 2578 scope.go:117] "RemoveContainer" containerID="0056eedf0388f35b191d11941e3ae449dda42a8ff5d1ac503bf0b9756db9abe7" Apr 23 18:00:42.723910 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:42.723865 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-d447l_openshift-console-operator(49f66b8f-5bbb-4d67-9dfd-cd24fb73773b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" podUID="49f66b8f-5bbb-4d67-9dfd-cd24fb73773b" Apr 23 18:00:44.468160 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:44.468121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:44.468680 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:44.468290 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 23 18:00:44.468680 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:44.468313 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-776bb79f5c-2r8bg: secret "image-registry-tls" not found Apr 23 18:00:44.468680 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:44.468388 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls podName:d5c381f1-454c-4290-9c44-d067a94c399b nodeName:}" failed. No retries permitted until 2026-04-23 18:00:52.468364579 +0000 UTC m=+136.702106487 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls") pod "image-registry-776bb79f5c-2r8bg" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b") : secret "image-registry-tls" not found Apr 23 18:00:44.670135 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:44.670090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:44.670307 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:44.670161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:44.670307 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:44.670261 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 23 18:00:44.670307 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:44.670268 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:52.670249864 +0000 UTC m=+136.903991774 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : configmap references non-existent config key: service-ca.crt Apr 23 18:00:44.670413 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:44.670326 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs podName:778d33bd-ade8-4471-a0d0-10670f14a624 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:52.670313711 +0000 UTC m=+136.904055606 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs") pod "router-default-64c9b47658-qqqmr" (UID: "778d33bd-ade8-4471-a0d0-10670f14a624") : secret "router-metrics-certs-default" not found Apr 23 18:00:44.708708 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:44.708680 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hl5qq_d8d9c074-5a2a-4898-b910-f1a16ffc62fc/dns-node-resolver/0.log" Apr 23 18:00:45.913386 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:45.913350 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l4wm6_22a39804-db9b-4a6b-a927-b5f0bb1d22eb/node-ca/0.log" Apr 23 18:00:46.181352 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:46.181250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 18:00:46.181522 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:46.181428 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 18:00:46.181522 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:46.181509 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs podName:c5673cab-427f-416d-a4ba-94ac7c29dc9c nodeName:}" failed. No retries permitted until 2026-04-23 18:02:48.181488708 +0000 UTC m=+252.415230616 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs") pod "network-metrics-daemon-xwp2q" (UID: "c5673cab-427f-416d-a4ba-94ac7c29dc9c") : secret "metrics-daemon-secret" not found Apr 23 18:00:47.049683 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:47.049626 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:47.049683 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:47.049689 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:47.050124 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:47.050050 2578 scope.go:117] "RemoveContainer" containerID="0056eedf0388f35b191d11941e3ae449dda42a8ff5d1ac503bf0b9756db9abe7" Apr 23 18:00:47.050241 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:00:47.050222 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-d447l_openshift-console-operator(49f66b8f-5bbb-4d67-9dfd-cd24fb73773b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" podUID="49f66b8f-5bbb-4d67-9dfd-cd24fb73773b" Apr 23 18:00:47.312340 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:47.312262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qhb84_0af64194-8451-4345-9044-583d24fa444c/kube-storage-version-migrator-operator/0.log" Apr 23 18:00:52.535410 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.535373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:52.537853 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.537815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"image-registry-776bb79f5c-2r8bg\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:52.653562 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.653517 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:52.737082 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.737047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:52.737291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.737102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:52.737694 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.737671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/778d33bd-ade8-4471-a0d0-10670f14a624-service-ca-bundle\") pod 
\"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:52.739497 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.739469 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/778d33bd-ade8-4471-a0d0-10670f14a624-metrics-certs\") pod \"router-default-64c9b47658-qqqmr\" (UID: \"778d33bd-ade8-4471-a0d0-10670f14a624\") " pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:52.760240 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.760207 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:52.778481 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.778397 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-776bb79f5c-2r8bg"] Apr 23 18:00:52.781696 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:00:52.781660 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c381f1_454c_4290_9c44_d067a94c399b.slice/crio-8ab8a6c237783ecdb01791c0e321c08bb9297777012a70d35a9b58dfe391c44c WatchSource:0}: Error finding container 8ab8a6c237783ecdb01791c0e321c08bb9297777012a70d35a9b58dfe391c44c: Status 404 returned error can't find the container with id 8ab8a6c237783ecdb01791c0e321c08bb9297777012a70d35a9b58dfe391c44c Apr 23 18:00:52.890158 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:52.890115 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-64c9b47658-qqqmr"] Apr 23 18:00:52.893247 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:00:52.893218 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod778d33bd_ade8_4471_a0d0_10670f14a624.slice/crio-5ef430bc4f58bb59d32e0d2ee11a2327117cef2e5376c7f6db8673a3636ad0a7 WatchSource:0}: Error finding container 5ef430bc4f58bb59d32e0d2ee11a2327117cef2e5376c7f6db8673a3636ad0a7: Status 404 returned error can't find the container with id 5ef430bc4f58bb59d32e0d2ee11a2327117cef2e5376c7f6db8673a3636ad0a7 Apr 23 18:00:53.751837 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.751806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-64c9b47658-qqqmr" event={"ID":"778d33bd-ade8-4471-a0d0-10670f14a624","Type":"ContainerStarted","Data":"daa4921c1bcf0110337d106b1b9b0453ca600afa18a95ccc56c0538e3af8881a"} Apr 23 18:00:53.752268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.751845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-64c9b47658-qqqmr" event={"ID":"778d33bd-ade8-4471-a0d0-10670f14a624","Type":"ContainerStarted","Data":"5ef430bc4f58bb59d32e0d2ee11a2327117cef2e5376c7f6db8673a3636ad0a7"} Apr 23 18:00:53.753174 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.753150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" event={"ID":"d5c381f1-454c-4290-9c44-d067a94c399b","Type":"ContainerStarted","Data":"8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd"} Apr 23 18:00:53.753276 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.753180 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" event={"ID":"d5c381f1-454c-4290-9c44-d067a94c399b","Type":"ContainerStarted","Data":"8ab8a6c237783ecdb01791c0e321c08bb9297777012a70d35a9b58dfe391c44c"} Apr 23 18:00:53.753276 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.753257 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:00:53.761316 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.761295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:53.763956 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.763935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:53.770368 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.770327 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-64c9b47658-qqqmr" podStartSLOduration=17.770313253 podStartE2EDuration="17.770313253s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:00:53.768724933 +0000 UTC m=+138.002466849" watchObservedRunningTime="2026-04-23 18:00:53.770313253 +0000 UTC m=+138.004055168" Apr 23 18:00:53.789388 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:53.789342 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" podStartSLOduration=17.789327939 podStartE2EDuration="17.789327939s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:00:53.788946033 +0000 UTC m=+138.022687955" watchObservedRunningTime="2026-04-23 18:00:53.789327939 +0000 UTC m=+138.023069855" Apr 23 18:00:54.756083 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:54.756044 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:54.757373 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:54.757349 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-64c9b47658-qqqmr" Apr 23 18:00:58.353969 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:58.353931 2578 scope.go:117] "RemoveContainer" containerID="0056eedf0388f35b191d11941e3ae449dda42a8ff5d1ac503bf0b9756db9abe7" Apr 23 18:00:58.768525 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:58.768501 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:00:58.768718 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:58.768593 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" event={"ID":"49f66b8f-5bbb-4d67-9dfd-cd24fb73773b","Type":"ContainerStarted","Data":"8b0d80743a38cbd482afbf552331f602d89907ef74e549d412e85a063ed83ea8"} Apr 23 18:00:58.768890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:58.768866 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:58.777334 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:58.777299 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" Apr 23 18:00:58.790845 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:00:58.790789 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-d447l" podStartSLOduration=20.269668291 podStartE2EDuration="22.790774696s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:00:37.175968092 +0000 UTC m=+121.409710000" lastFinishedPulling="2026-04-23 18:00:39.697074508 +0000 UTC m=+123.930816405" observedRunningTime="2026-04-23 18:00:58.790079754 +0000 UTC m=+143.023821680" 
watchObservedRunningTime="2026-04-23 18:00:58.790774696 +0000 UTC m=+143.024516611" Apr 23 18:01:04.800693 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.800664 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8mlfw"] Apr 23 18:01:04.803623 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.803601 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.807810 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.807791 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 18:01:04.808456 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.808440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 18:01:04.808666 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.808645 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rmsv4\"" Apr 23 18:01:04.808748 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.808655 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 18:01:04.808806 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.808747 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 18:01:04.815005 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.814985 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8mlfw"] Apr 23 18:01:04.831071 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.831040 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-776bb79f5c-2r8bg"] Apr 23 18:01:04.833043 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.833021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-crio-socket\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.833147 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.833071 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w494\" (UniqueName: \"kubernetes.io/projected/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-kube-api-access-5w494\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.833147 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.833097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-data-volume\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.833220 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.833177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.833220 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.833203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.934579 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.934525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w494\" (UniqueName: \"kubernetes.io/projected/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-kube-api-access-5w494\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.934579 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.934584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-data-volume\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.934811 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.934638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.934811 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.934667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.934811 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.934760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-crio-socket\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.934932 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.934834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-crio-socket\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.935080 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.935058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-data-volume\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.935270 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.935254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.936917 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.936901 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8mlfw\" (UID: 
\"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:04.942782 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:04.942764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w494\" (UniqueName: \"kubernetes.io/projected/ea634ca7-4a3e-497f-a8d4-a4443b1dcf50-kube-api-access-5w494\") pod \"insights-runtime-extractor-8mlfw\" (UID: \"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50\") " pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:05.112483 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:05.112399 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8mlfw" Apr 23 18:01:05.238214 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:05.238185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8mlfw"] Apr 23 18:01:05.241205 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:05.241176 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea634ca7_4a3e_497f_a8d4_a4443b1dcf50.slice/crio-17e33e75c802510bb45030e34a788725189598f86aae72f6db1c7696372b0a73 WatchSource:0}: Error finding container 17e33e75c802510bb45030e34a788725189598f86aae72f6db1c7696372b0a73: Status 404 returned error can't find the container with id 17e33e75c802510bb45030e34a788725189598f86aae72f6db1c7696372b0a73 Apr 23 18:01:05.786843 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:05.786801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mlfw" event={"ID":"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50","Type":"ContainerStarted","Data":"40abdf4d271273d5a1586e0e7286091a87e61325702d4cc55f6aed001b6ef135"} Apr 23 18:01:05.786843 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:05.786852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-8mlfw" event={"ID":"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50","Type":"ContainerStarted","Data":"17e33e75c802510bb45030e34a788725189598f86aae72f6db1c7696372b0a73"} Apr 23 18:01:06.791756 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:06.791716 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mlfw" event={"ID":"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50","Type":"ContainerStarted","Data":"48b05c8f82a0b186cc10d36d37a25893eefdeaaf159fd1d0d0a69db0f75003dc"} Apr 23 18:01:07.796098 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:07.796061 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8mlfw" event={"ID":"ea634ca7-4a3e-497f-a8d4-a4443b1dcf50","Type":"ContainerStarted","Data":"e3ede37e5bc94bcb78b143bc4a711a8073bbfc0ac356a3285fdfd794db2bcf35"} Apr 23 18:01:07.814294 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:07.814249 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8mlfw" podStartSLOduration=1.8966424229999999 podStartE2EDuration="3.814235363s" podCreationTimestamp="2026-04-23 18:01:04 +0000 UTC" firstStartedPulling="2026-04-23 18:01:05.290108042 +0000 UTC m=+149.523849939" lastFinishedPulling="2026-04-23 18:01:07.207700972 +0000 UTC m=+151.441442879" observedRunningTime="2026-04-23 18:01:07.81320557 +0000 UTC m=+152.046947510" watchObservedRunningTime="2026-04-23 18:01:07.814235363 +0000 UTC m=+152.047977325" Apr 23 18:01:11.722553 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:11.722489 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bgn7x" podUID="5240d464-6fd9-4f8a-819f-0385f4314995" Apr 23 18:01:11.736270 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:11.736231 2578 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-86jl7" podUID="ff148188-17a2-4b88-a857-ae14164f4a06" Apr 23 18:01:11.810294 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.810267 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bgn7x" Apr 23 18:01:11.904627 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.904592 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq"] Apr 23 18:01:11.907257 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.907237 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:11.909037 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.909018 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 18:01:11.909742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.909722 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-6twbx\"" Apr 23 18:01:11.916123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.916102 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq"] Apr 23 18:01:11.988361 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:11.988287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/33a54c72-a227-4426-b514-e41232818756-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vsxkq\" (UID: 
\"33a54c72-a227-4426-b514-e41232818756\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:12.088670 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:12.088623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/33a54c72-a227-4426-b514-e41232818756-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vsxkq\" (UID: \"33a54c72-a227-4426-b514-e41232818756\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:12.088808 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:12.088759 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 23 18:01:12.088854 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:12.088828 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a54c72-a227-4426-b514-e41232818756-tls-certificates podName:33a54c72-a227-4426-b514-e41232818756 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:12.588814079 +0000 UTC m=+156.822555974 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/33a54c72-a227-4426-b514-e41232818756-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-vsxkq" (UID: "33a54c72-a227-4426-b514-e41232818756") : secret "prometheus-operator-admission-webhook-tls" not found Apr 23 18:01:12.593368 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:12.593339 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/33a54c72-a227-4426-b514-e41232818756-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vsxkq\" (UID: \"33a54c72-a227-4426-b514-e41232818756\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:12.595758 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:12.595727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/33a54c72-a227-4426-b514-e41232818756-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vsxkq\" (UID: \"33a54c72-a227-4426-b514-e41232818756\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:12.816501 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:12.816467 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:12.932075 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:12.932047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq"] Apr 23 18:01:12.935041 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:12.935016 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33a54c72_a227_4426_b514_e41232818756.slice/crio-f2cf1aee206069fac0ecad62ef14b2f58c5578ce25cb3ba419bbb1f40f2e3d3a WatchSource:0}: Error finding container f2cf1aee206069fac0ecad62ef14b2f58c5578ce25cb3ba419bbb1f40f2e3d3a: Status 404 returned error can't find the container with id f2cf1aee206069fac0ecad62ef14b2f58c5578ce25cb3ba419bbb1f40f2e3d3a Apr 23 18:01:13.369229 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:13.369173 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-xwp2q" podUID="c5673cab-427f-416d-a4ba-94ac7c29dc9c" Apr 23 18:01:13.816377 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:13.816337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" event={"ID":"33a54c72-a227-4426-b514-e41232818756","Type":"ContainerStarted","Data":"f2cf1aee206069fac0ecad62ef14b2f58c5578ce25cb3ba419bbb1f40f2e3d3a"} Apr 23 18:01:14.819842 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.819804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" event={"ID":"33a54c72-a227-4426-b514-e41232818756","Type":"ContainerStarted","Data":"c90fd8882e2480032082e3214349d6bc7966fadfe63b0258cc3f69ce104c513b"} Apr 23 
18:01:14.820211 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.820046 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:14.824586 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.824564 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" Apr 23 18:01:14.835253 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.835200 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vsxkq" podStartSLOduration=2.797253375 podStartE2EDuration="3.835188212s" podCreationTimestamp="2026-04-23 18:01:11 +0000 UTC" firstStartedPulling="2026-04-23 18:01:12.936697049 +0000 UTC m=+157.170438942" lastFinishedPulling="2026-04-23 18:01:13.974631873 +0000 UTC m=+158.208373779" observedRunningTime="2026-04-23 18:01:14.834240906 +0000 UTC m=+159.067982828" watchObservedRunningTime="2026-04-23 18:01:14.835188212 +0000 UTC m=+159.068930148" Apr 23 18:01:14.837893 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.837862 2578 patch_prober.go:28] interesting pod/image-registry-776bb79f5c-2r8bg container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 18:01:14.838181 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.838139 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:01:14.969142 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.969106 
2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gzbjc"] Apr 23 18:01:14.971184 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.971164 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:14.973341 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.973304 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 18:01:14.973341 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.973315 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 18:01:14.973341 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.973304 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 18:01:14.973569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.973308 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 18:01:14.973569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.973358 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qjphp\"" Apr 23 18:01:14.973569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.973308 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 18:01:14.978937 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:14.978914 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gzbjc"] Apr 23 18:01:15.012852 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.012819 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.013000 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.012856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b84d1492-2668-484a-a1a9-0c1404a30918-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.013000 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.012967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswgg\" (UniqueName: \"kubernetes.io/projected/b84d1492-2668-484a-a1a9-0c1404a30918-kube-api-access-gswgg\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.013000 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.012994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.113558 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.113450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gswgg\" (UniqueName: 
\"kubernetes.io/projected/b84d1492-2668-484a-a1a9-0c1404a30918-kube-api-access-gswgg\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.113558 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.113492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.113558 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.113529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.113836 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:15.113621 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 18:01:15.113836 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.113680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b84d1492-2668-484a-a1a9-0c1404a30918-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.113836 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:15.113697 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls podName:b84d1492-2668-484a-a1a9-0c1404a30918 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:15.61367734 +0000 UTC m=+159.847419247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gzbjc" (UID: "b84d1492-2668-484a-a1a9-0c1404a30918") : secret "prometheus-operator-tls" not found Apr 23 18:01:15.114297 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.114276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b84d1492-2668-484a-a1a9-0c1404a30918-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.115993 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.115965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.122256 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:15.122234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswgg\" (UniqueName: \"kubernetes.io/projected/b84d1492-2668-484a-a1a9-0c1404a30918-kube-api-access-gswgg\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.617591 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:01:15.617524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:15.617781 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:15.617680 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 18:01:15.617781 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:15.617750 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls podName:b84d1492-2668-484a-a1a9-0c1404a30918 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:16.61773542 +0000 UTC m=+160.851477313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gzbjc" (UID: "b84d1492-2668-484a-a1a9-0c1404a30918") : secret "prometheus-operator-tls" not found Apr 23 18:01:16.626922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.626875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 18:01:16.626922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.626927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:16.627371 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.626992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 18:01:16.629491 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.629454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b84d1492-2668-484a-a1a9-0c1404a30918-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gzbjc\" (UID: \"b84d1492-2668-484a-a1a9-0c1404a30918\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:16.629491 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.629484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5240d464-6fd9-4f8a-819f-0385f4314995-metrics-tls\") pod \"dns-default-bgn7x\" (UID: \"5240d464-6fd9-4f8a-819f-0385f4314995\") " pod="openshift-dns/dns-default-bgn7x" Apr 23 18:01:16.629651 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.629564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff148188-17a2-4b88-a857-ae14164f4a06-cert\") pod \"ingress-canary-86jl7\" (UID: \"ff148188-17a2-4b88-a857-ae14164f4a06\") " pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 18:01:16.780115 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.780077 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" Apr 23 18:01:16.895699 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.895622 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gzbjc"] Apr 23 18:01:16.900038 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:16.900012 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84d1492_2668_484a_a1a9_0c1404a30918.slice/crio-6b0d3e2ec00ade91336bb34978a9859b6dd4d592cc989b4a291150ec4d8f4eb0 WatchSource:0}: Error finding container 6b0d3e2ec00ade91336bb34978a9859b6dd4d592cc989b4a291150ec4d8f4eb0: Status 404 returned error can't find the container with id 6b0d3e2ec00ade91336bb34978a9859b6dd4d592cc989b4a291150ec4d8f4eb0 Apr 23 18:01:16.912653 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.912624 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sv5q9\"" Apr 23 18:01:16.921062 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:16.921042 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bgn7x" Apr 23 18:01:17.039183 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:17.039153 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bgn7x"] Apr 23 18:01:17.042005 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:17.041971 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5240d464_6fd9_4f8a_819f_0385f4314995.slice/crio-96256b1d14665f77f54fe3aaafca0b5fcf4518289de21736b2b4ba790f8c231a WatchSource:0}: Error finding container 96256b1d14665f77f54fe3aaafca0b5fcf4518289de21736b2b4ba790f8c231a: Status 404 returned error can't find the container with id 96256b1d14665f77f54fe3aaafca0b5fcf4518289de21736b2b4ba790f8c231a Apr 23 18:01:17.829205 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:17.829151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bgn7x" event={"ID":"5240d464-6fd9-4f8a-819f-0385f4314995","Type":"ContainerStarted","Data":"96256b1d14665f77f54fe3aaafca0b5fcf4518289de21736b2b4ba790f8c231a"} Apr 23 18:01:17.830422 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:17.830377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" event={"ID":"b84d1492-2668-484a-a1a9-0c1404a30918","Type":"ContainerStarted","Data":"6b0d3e2ec00ade91336bb34978a9859b6dd4d592cc989b4a291150ec4d8f4eb0"} Apr 23 18:01:18.834678 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.834642 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" event={"ID":"b84d1492-2668-484a-a1a9-0c1404a30918","Type":"ContainerStarted","Data":"9463cdf63eb903bf776e72299b8f392d81cb892152ee27153ccf72ab80c1bf3d"} Apr 23 18:01:18.834678 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.834683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" event={"ID":"b84d1492-2668-484a-a1a9-0c1404a30918","Type":"ContainerStarted","Data":"60af01e891c8c74566c2151fd73325c3b85a58a29172d281ddbd10c8d2bb80b5"} Apr 23 18:01:18.836146 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.836123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bgn7x" event={"ID":"5240d464-6fd9-4f8a-819f-0385f4314995","Type":"ContainerStarted","Data":"1eb286ad5286583a8bc093c4cd93b9b412e3ecd02ad7122c6d390f500a38d44a"} Apr 23 18:01:18.836259 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.836152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bgn7x" event={"ID":"5240d464-6fd9-4f8a-819f-0385f4314995","Type":"ContainerStarted","Data":"0954e31f7db08bfaa5c1920664e69e00aefe6302bf497c1312d8f1cacfd401f1"} Apr 23 18:01:18.836314 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.836282 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bgn7x" Apr 23 18:01:18.851403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.851353 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gzbjc" podStartSLOduration=3.330611774 podStartE2EDuration="4.851339427s" podCreationTimestamp="2026-04-23 18:01:14 +0000 UTC" firstStartedPulling="2026-04-23 18:01:16.902310181 +0000 UTC m=+161.136052075" lastFinishedPulling="2026-04-23 18:01:18.42303783 +0000 UTC m=+162.656779728" observedRunningTime="2026-04-23 18:01:18.850249834 +0000 UTC m=+163.083991761" watchObservedRunningTime="2026-04-23 18:01:18.851339427 +0000 UTC m=+163.085081334" Apr 23 18:01:18.868929 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:18.868881 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bgn7x" podStartSLOduration=129.486544716 podStartE2EDuration="2m10.868866307s" 
podCreationTimestamp="2026-04-23 17:59:08 +0000 UTC" firstStartedPulling="2026-04-23 18:01:17.043780081 +0000 UTC m=+161.277521975" lastFinishedPulling="2026-04-23 18:01:18.426101658 +0000 UTC m=+162.659843566" observedRunningTime="2026-04-23 18:01:18.868413527 +0000 UTC m=+163.102155444" watchObservedRunningTime="2026-04-23 18:01:18.868866307 +0000 UTC m=+163.102608222" Apr 23 18:01:20.311966 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.311930 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf"] Apr 23 18:01:20.314493 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.314471 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.316491 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.316465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 18:01:20.316611 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.316488 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-9l9bb\"" Apr 23 18:01:20.316611 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.316471 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 18:01:20.323449 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.323426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf"] Apr 23 18:01:20.347123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.347089 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hcbrd"] Apr 23 18:01:20.349381 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.349363 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.351755 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.351733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 18:01:20.351755 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.351753 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 18:01:20.351989 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.351794 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 18:01:20.351989 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.351861 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sclw6\"" Apr 23 18:01:20.358054 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.358034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d70cff-048a-4745-b44b-9b84f75b930e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.358154 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.358071 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46d70cff-048a-4745-b44b-9b84f75b930e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.358154 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.358129 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkzf\" (UniqueName: \"kubernetes.io/projected/46d70cff-048a-4745-b44b-9b84f75b930e-kube-api-access-htkzf\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.358220 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.358172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46d70cff-048a-4745-b44b-9b84f75b930e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.458872 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.458834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htkzf\" (UniqueName: \"kubernetes.io/projected/46d70cff-048a-4745-b44b-9b84f75b930e-kube-api-access-htkzf\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.459017 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.458886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-accelerators-collector-config\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459017 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.458921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46d70cff-048a-4745-b44b-9b84f75b930e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.459017 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.458941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-textfile\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459017 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.458959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4105c7e-1c6e-46a3-a884-5c701411dd9d-metrics-client-ca\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459017 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.458997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-sys\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459231 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-wtmp\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " 
pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459231 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d70cff-048a-4745-b44b-9b84f75b930e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.459297 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459240 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-root\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459297 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8nr6\" (UniqueName: \"kubernetes.io/projected/a4105c7e-1c6e-46a3-a884-5c701411dd9d-kube-api-access-r8nr6\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459369 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459305 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46d70cff-048a-4745-b44b-9b84f75b930e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.459369 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-tls\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459463 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.459983 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.459966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46d70cff-048a-4745-b44b-9b84f75b930e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.461465 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.461438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46d70cff-048a-4745-b44b-9b84f75b930e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.461599 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.461577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d70cff-048a-4745-b44b-9b84f75b930e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.466387 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.466362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkzf\" (UniqueName: \"kubernetes.io/projected/46d70cff-048a-4745-b44b-9b84f75b930e-kube-api-access-htkzf\") pod \"openshift-state-metrics-9d44df66c-b27pf\" (UID: \"46d70cff-048a-4745-b44b-9b84f75b930e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.560515 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-tls\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560698 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560698 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-accelerators-collector-config\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560698 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:20.560615 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret 
"node-exporter-tls" not found Apr 23 18:01:20.560698 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-textfile\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560698 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:20.560682 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-tls podName:a4105c7e-1c6e-46a3-a884-5c701411dd9d nodeName:}" failed. No retries permitted until 2026-04-23 18:01:21.060660253 +0000 UTC m=+165.294402152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-tls") pod "node-exporter-hcbrd" (UID: "a4105c7e-1c6e-46a3-a884-5c701411dd9d") : secret "node-exporter-tls" not found Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4105c7e-1c6e-46a3-a884-5c701411dd9d-metrics-client-ca\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-sys\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560766 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-wtmp\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-root\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nr6\" (UniqueName: \"kubernetes.io/projected/a4105c7e-1c6e-46a3-a884-5c701411dd9d-kube-api-access-r8nr6\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-sys\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.560945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.560929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-root\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.561298 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.561051 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-wtmp\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.561298 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.561179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-textfile\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.561373 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.561306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4105c7e-1c6e-46a3-a884-5c701411dd9d-metrics-client-ca\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.562005 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.561954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-accelerators-collector-config\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.562840 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.562824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.568211 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:01:20.568186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8nr6\" (UniqueName: \"kubernetes.io/projected/a4105c7e-1c6e-46a3-a884-5c701411dd9d-kube-api-access-r8nr6\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:20.624200 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.624157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" Apr 23 18:01:20.743388 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.743345 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf"] Apr 23 18:01:20.747443 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:20.747416 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d70cff_048a_4745_b44b_9b84f75b930e.slice/crio-f8b9fa280d96bfc7b172e4d102f81664a8def1d4f8f816f3edcdf212d12a5795 WatchSource:0}: Error finding container f8b9fa280d96bfc7b172e4d102f81664a8def1d4f8f816f3edcdf212d12a5795: Status 404 returned error can't find the container with id f8b9fa280d96bfc7b172e4d102f81664a8def1d4f8f816f3edcdf212d12a5795 Apr 23 18:01:20.843173 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.843141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" event={"ID":"46d70cff-048a-4745-b44b-9b84f75b930e","Type":"ContainerStarted","Data":"8c73a284fb20d9beb377ec6870813b912468fc1f1d939fc0c124ebc803b1dc68"} Apr 23 18:01:20.843173 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:20.843176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" 
event={"ID":"46d70cff-048a-4745-b44b-9b84f75b930e","Type":"ContainerStarted","Data":"f8b9fa280d96bfc7b172e4d102f81664a8def1d4f8f816f3edcdf212d12a5795"} Apr 23 18:01:21.065724 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:21.065634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-tls\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:21.067975 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:21.067957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a4105c7e-1c6e-46a3-a884-5c701411dd9d-node-exporter-tls\") pod \"node-exporter-hcbrd\" (UID: \"a4105c7e-1c6e-46a3-a884-5c701411dd9d\") " pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:21.259736 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:21.259703 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hcbrd" Apr 23 18:01:21.269486 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:21.269453 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4105c7e_1c6e_46a3_a884_5c701411dd9d.slice/crio-b2e67d19ac03f403a2bc34b7d0d6a90520cbdbedf6b5f99520bf8f4946eaef9d WatchSource:0}: Error finding container b2e67d19ac03f403a2bc34b7d0d6a90520cbdbedf6b5f99520bf8f4946eaef9d: Status 404 returned error can't find the container with id b2e67d19ac03f403a2bc34b7d0d6a90520cbdbedf6b5f99520bf8f4946eaef9d Apr 23 18:01:21.847452 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:21.847249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hcbrd" event={"ID":"a4105c7e-1c6e-46a3-a884-5c701411dd9d","Type":"ContainerStarted","Data":"b2e67d19ac03f403a2bc34b7d0d6a90520cbdbedf6b5f99520bf8f4946eaef9d"} Apr 23 18:01:21.849675 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:21.849645 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" event={"ID":"46d70cff-048a-4745-b44b-9b84f75b930e","Type":"ContainerStarted","Data":"67d0e5227b4e8b369ea13c9a3abb4aa9c2d9212e23479853c09706a0797611d6"} Apr 23 18:01:22.853802 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:22.853766 2578 generic.go:358] "Generic (PLEG): container finished" podID="a4105c7e-1c6e-46a3-a884-5c701411dd9d" containerID="9ddcf1f79f4e6356f8c795e7800dfd7c44effecb95132fdc5d36b03ccc64c041" exitCode=0 Apr 23 18:01:22.854323 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:22.853858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hcbrd" event={"ID":"a4105c7e-1c6e-46a3-a884-5c701411dd9d","Type":"ContainerDied","Data":"9ddcf1f79f4e6356f8c795e7800dfd7c44effecb95132fdc5d36b03ccc64c041"} Apr 23 18:01:22.855715 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:01:22.855686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" event={"ID":"46d70cff-048a-4745-b44b-9b84f75b930e","Type":"ContainerStarted","Data":"c22ff1d2710d0f9c708de229dc477357513dfee8cfa09d4713dc6efbf55d354d"} Apr 23 18:01:22.888059 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:22.888016 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b27pf" podStartSLOduration=1.94086998 podStartE2EDuration="2.887999586s" podCreationTimestamp="2026-04-23 18:01:20 +0000 UTC" firstStartedPulling="2026-04-23 18:01:20.861781389 +0000 UTC m=+165.095523286" lastFinishedPulling="2026-04-23 18:01:21.808910985 +0000 UTC m=+166.042652892" observedRunningTime="2026-04-23 18:01:22.887254664 +0000 UTC m=+167.120996591" watchObservedRunningTime="2026-04-23 18:01:22.887999586 +0000 UTC m=+167.121741505" Apr 23 18:01:23.860026 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:23.859984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hcbrd" event={"ID":"a4105c7e-1c6e-46a3-a884-5c701411dd9d","Type":"ContainerStarted","Data":"2eda6fee208df47acc6e8694857701a1c72b9f191f3bf70c6fc28ba880ca49a1"} Apr 23 18:01:23.860026 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:23.860031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hcbrd" event={"ID":"a4105c7e-1c6e-46a3-a884-5c701411dd9d","Type":"ContainerStarted","Data":"9a2c7d8c2a1fe23f2af4432c37e55d6b4c9c8021268da7737058f8d731e4115e"} Apr 23 18:01:23.882003 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:23.881948 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hcbrd" podStartSLOduration=2.9971441629999998 podStartE2EDuration="3.881933154s" podCreationTimestamp="2026-04-23 18:01:20 +0000 UTC" firstStartedPulling="2026-04-23 18:01:21.271662227 
+0000 UTC m=+165.505404121" lastFinishedPulling="2026-04-23 18:01:22.156451218 +0000 UTC m=+166.390193112" observedRunningTime="2026-04-23 18:01:23.880863119 +0000 UTC m=+168.114605038" watchObservedRunningTime="2026-04-23 18:01:23.881933154 +0000 UTC m=+168.115675108" Apr 23 18:01:24.645720 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.645625 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-77759d78bc-x8rbw"] Apr 23 18:01:24.649152 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.649124 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.651328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.651297 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 18:01:24.652182 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.652158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-63ttna7ddk5is\"" Apr 23 18:01:24.652322 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.652158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-798bc\"" Apr 23 18:01:24.652322 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.652217 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 18:01:24.652406 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.652357 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 18:01:24.652440 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.652419 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 
18:01:24.658019 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.657995 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77759d78bc-x8rbw"] Apr 23 18:01:24.696508 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-client-ca-bundle\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.696508 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0d70a0dc-d43e-49a9-9f20-85985564bd98-audit-log\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.696766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-secret-metrics-server-tls\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.696766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jqv\" (UniqueName: \"kubernetes.io/projected/0d70a0dc-d43e-49a9-9f20-85985564bd98-kube-api-access-v2jqv\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 
18:01:24.696766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70a0dc-d43e-49a9-9f20-85985564bd98-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.696766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0d70a0dc-d43e-49a9-9f20-85985564bd98-metrics-server-audit-profiles\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.696766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.696726 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-secret-metrics-server-client-certs\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797419 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0d70a0dc-d43e-49a9-9f20-85985564bd98-audit-log\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-secret-metrics-server-tls\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jqv\" (UniqueName: \"kubernetes.io/projected/0d70a0dc-d43e-49a9-9f20-85985564bd98-kube-api-access-v2jqv\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70a0dc-d43e-49a9-9f20-85985564bd98-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0d70a0dc-d43e-49a9-9f20-85985564bd98-metrics-server-audit-profiles\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797600 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-secret-metrics-server-client-certs\") pod \"metrics-server-77759d78bc-x8rbw\" 
(UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797897 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-client-ca-bundle\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.797897 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.797836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0d70a0dc-d43e-49a9-9f20-85985564bd98-audit-log\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.798793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.798762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0d70a0dc-d43e-49a9-9f20-85985564bd98-metrics-server-audit-profiles\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.799102 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.799077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70a0dc-d43e-49a9-9f20-85985564bd98-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.803852 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.803817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-client-ca-bundle\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.803978 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.803878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-secret-metrics-server-tls\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.803978 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.803954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0d70a0dc-d43e-49a9-9f20-85985564bd98-secret-metrics-server-client-certs\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.807333 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.807307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jqv\" (UniqueName: \"kubernetes.io/projected/0d70a0dc-d43e-49a9-9f20-85985564bd98-kube-api-access-v2jqv\") pod \"metrics-server-77759d78bc-x8rbw\" (UID: \"0d70a0dc-d43e-49a9-9f20-85985564bd98\") " pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:24.834901 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.834871 2578 patch_prober.go:28] interesting pod/image-registry-776bb79f5c-2r8bg container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see 
/debug/health"}]} Apr 23 18:01:24.835070 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.834926 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:01:24.959362 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:24.959266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:25.089799 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.089768 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77759d78bc-x8rbw"] Apr 23 18:01:25.092905 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:25.092876 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d70a0dc_d43e_49a9_9f20_85985564bd98.slice/crio-1fa5c493cf13fc76451e2bf1f353d06c1dcd7bd728aec63918ac2ea939758815 WatchSource:0}: Error finding container 1fa5c493cf13fc76451e2bf1f353d06c1dcd7bd728aec63918ac2ea939758815: Status 404 returned error can't find the container with id 1fa5c493cf13fc76451e2bf1f353d06c1dcd7bd728aec63918ac2ea939758815 Apr 23 18:01:25.105265 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.105237 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz"] Apr 23 18:01:25.109512 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.109493 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:25.114252 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.114030 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 18:01:25.114580 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.113935 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-n9zgx\"" Apr 23 18:01:25.114737 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.114698 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz"] Apr 23 18:01:25.200812 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.200764 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15a23a51-53a4-43ca-950f-3449fae6160c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f8bkz\" (UID: \"15a23a51-53a4-43ca-950f-3449fae6160c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:25.302151 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.302095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15a23a51-53a4-43ca-950f-3449fae6160c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f8bkz\" (UID: \"15a23a51-53a4-43ca-950f-3449fae6160c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:25.302323 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:25.302259 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 18:01:25.302364 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:25.302324 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/15a23a51-53a4-43ca-950f-3449fae6160c-monitoring-plugin-cert podName:15a23a51-53a4-43ca-950f-3449fae6160c nodeName:}" failed. No retries permitted until 2026-04-23 18:01:25.802307672 +0000 UTC m=+170.036049566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/15a23a51-53a4-43ca-950f-3449fae6160c-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-f8bkz" (UID: "15a23a51-53a4-43ca-950f-3449fae6160c") : secret "monitoring-plugin-cert" not found Apr 23 18:01:25.573031 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.572947 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz"] Apr 23 18:01:25.576427 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.576401 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.578828 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.578803 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6ch25\"" Apr 23 18:01:25.578957 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.578844 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 18:01:25.578957 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.578813 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 18:01:25.578957 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.578895 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 18:01:25.578957 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.578912 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 18:01:25.579167 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.579132 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 18:01:25.585273 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.585249 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 18:01:25.589252 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.589223 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz"] Apr 23 18:01:25.705428 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-telemeter-client-tls\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705428 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-federate-client-tls\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705691 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-metrics-client-ca\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: 
\"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705691 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5v6s\" (UniqueName: \"kubernetes.io/projected/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-kube-api-access-t5v6s\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705691 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-secret-telemeter-client\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705691 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705691 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " 
pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.705843 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.705702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-serving-certs-ca-bundle\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807021 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.806986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-secret-telemeter-client\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807021 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807077 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-serving-certs-ca-bundle\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-telemeter-client-tls\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-federate-client-tls\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-metrics-client-ca\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.807921 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807891 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15a23a51-53a4-43ca-950f-3449fae6160c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f8bkz\" 
(UID: \"15a23a51-53a4-43ca-950f-3449fae6160c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:25.808054 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-metrics-client-ca\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.808054 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.807954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5v6s\" (UniqueName: \"kubernetes.io/projected/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-kube-api-access-t5v6s\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.808177 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.808092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.808672 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.808618 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-serving-certs-ca-bundle\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.810183 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.810154 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-telemeter-client-tls\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.810183 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.810171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-secret-telemeter-client\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.810353 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.810210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.810794 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.810769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15a23a51-53a4-43ca-950f-3449fae6160c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-f8bkz\" (UID: \"15a23a51-53a4-43ca-950f-3449fae6160c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:25.811012 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.810993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-federate-client-tls\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: 
\"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.816331 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.816311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5v6s\" (UniqueName: \"kubernetes.io/projected/2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc-kube-api-access-t5v6s\") pod \"telemeter-client-54cfcb5d4b-ngzbz\" (UID: \"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc\") " pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:25.868483 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.868389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" event={"ID":"0d70a0dc-d43e-49a9-9f20-85985564bd98","Type":"ContainerStarted","Data":"1fa5c493cf13fc76451e2bf1f353d06c1dcd7bd728aec63918ac2ea939758815"} Apr 23 18:01:25.886639 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:25.886603 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" Apr 23 18:01:26.020943 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.020903 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:26.033165 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.033126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz"] Apr 23 18:01:26.037899 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:26.037856 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fcc7b2c_3477_450d_bd6a_b9d51c4abbcc.slice/crio-960b8a4cecd65538bca0d86f39b42ae9e602f1f550fc8ffdf0f1e96cfa30807e WatchSource:0}: Error finding container 960b8a4cecd65538bca0d86f39b42ae9e602f1f550fc8ffdf0f1e96cfa30807e: Status 404 returned error can't find the container with id 960b8a4cecd65538bca0d86f39b42ae9e602f1f550fc8ffdf0f1e96cfa30807e Apr 23 18:01:26.159503 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.159470 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz"] Apr 23 18:01:26.359893 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.359848 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 18:01:26.364204 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.363670 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tsbqj\"" Apr 23 18:01:26.372122 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.370346 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-86jl7" Apr 23 18:01:26.745341 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.743081 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-86jl7"] Apr 23 18:01:26.747218 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:26.747176 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff148188_17a2_4b88_a857_ae14164f4a06.slice/crio-b76363b315771b2c10c774834059da655c60b0a21bb0a71630a565f2add8e6c9 WatchSource:0}: Error finding container b76363b315771b2c10c774834059da655c60b0a21bb0a71630a565f2add8e6c9: Status 404 returned error can't find the container with id b76363b315771b2c10c774834059da655c60b0a21bb0a71630a565f2add8e6c9 Apr 23 18:01:26.873133 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.873032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-86jl7" event={"ID":"ff148188-17a2-4b88-a857-ae14164f4a06","Type":"ContainerStarted","Data":"b76363b315771b2c10c774834059da655c60b0a21bb0a71630a565f2add8e6c9"} Apr 23 18:01:26.874568 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.874489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" event={"ID":"15a23a51-53a4-43ca-950f-3449fae6160c","Type":"ContainerStarted","Data":"35b17a17a0c688e540fbf44f391cba0a6e8c43b5cd0ceed2128716e883ba7f3f"} Apr 23 18:01:26.876914 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.876849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" event={"ID":"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc","Type":"ContainerStarted","Data":"960b8a4cecd65538bca0d86f39b42ae9e602f1f550fc8ffdf0f1e96cfa30807e"} Apr 23 18:01:26.879364 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.879313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" event={"ID":"0d70a0dc-d43e-49a9-9f20-85985564bd98","Type":"ContainerStarted","Data":"0da900b37eb536bb4df6d2fa96c3a9aecdae71529b65f4e062b1b06a3b549f92"} Apr 23 18:01:26.897151 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:26.897089 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" podStartSLOduration=1.338084153 podStartE2EDuration="2.897068159s" podCreationTimestamp="2026-04-23 18:01:24 +0000 UTC" firstStartedPulling="2026-04-23 18:01:25.095358627 +0000 UTC m=+169.329100520" lastFinishedPulling="2026-04-23 18:01:26.654342618 +0000 UTC m=+170.888084526" observedRunningTime="2026-04-23 18:01:26.896973452 +0000 UTC m=+171.130715369" watchObservedRunningTime="2026-04-23 18:01:26.897068159 +0000 UTC m=+171.130810072" Apr 23 18:01:27.353676 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:27.353634 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 18:01:28.841871 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:28.841836 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bgn7x" Apr 23 18:01:29.849951 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.849907 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" containerName="registry" containerID="cri-o://8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd" gracePeriod=30 Apr 23 18:01:29.890730 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.890669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-86jl7" event={"ID":"ff148188-17a2-4b88-a857-ae14164f4a06","Type":"ContainerStarted","Data":"abc8b6f76c489c0ab4c0ca120cb93ea200a0f0be160c43fc239d54f2819d97fd"} 
Apr 23 18:01:29.892332 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.892297 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" event={"ID":"15a23a51-53a4-43ca-950f-3449fae6160c","Type":"ContainerStarted","Data":"2e2004e70c93eb7a42c36099f84d67e98adc7a88e23549a2b7e71091c676bc83"} Apr 23 18:01:29.892557 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.892512 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:29.893898 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.893858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" event={"ID":"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc","Type":"ContainerStarted","Data":"f6a38f23cda43fc53463e4b97330e6790e05b1e6597a900d9b095b885f677ad4"} Apr 23 18:01:29.898753 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.898717 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" Apr 23 18:01:29.911551 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.911494 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-86jl7" podStartSLOduration=139.720352278 podStartE2EDuration="2m21.911471371s" podCreationTimestamp="2026-04-23 17:59:08 +0000 UTC" firstStartedPulling="2026-04-23 18:01:26.749229559 +0000 UTC m=+170.982971453" lastFinishedPulling="2026-04-23 18:01:28.940348649 +0000 UTC m=+173.174090546" observedRunningTime="2026-04-23 18:01:29.90991976 +0000 UTC m=+174.143661677" watchObservedRunningTime="2026-04-23 18:01:29.911471371 +0000 UTC m=+174.145213287" Apr 23 18:01:29.925554 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:29.925476 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-f8bkz" podStartSLOduration=2.581251093 podStartE2EDuration="4.925456964s" podCreationTimestamp="2026-04-23 18:01:25 +0000 UTC" firstStartedPulling="2026-04-23 18:01:26.591797583 +0000 UTC m=+170.825539481" lastFinishedPulling="2026-04-23 18:01:28.936003443 +0000 UTC m=+173.169745352" observedRunningTime="2026-04-23 18:01:29.924522646 +0000 UTC m=+174.158264562" watchObservedRunningTime="2026-04-23 18:01:29.925456964 +0000 UTC m=+174.159198879" Apr 23 18:01:30.212519 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.212495 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:01:30.354922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.354881 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-trusted-ca\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.354922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.354922 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d5c381f1-454c-4290-9c44-d067a94c399b-ca-trust-extracted\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355167 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.354959 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-bound-sa-token\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355167 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355056 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-nvzvn\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-kube-api-access-nvzvn\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355167 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355103 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-image-registry-private-configuration\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355167 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355147 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-installation-pull-secrets\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355356 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355190 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-registry-certificates\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355356 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355244 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") pod \"d5c381f1-454c-4290-9c44-d067a94c399b\" (UID: \"d5c381f1-454c-4290-9c44-d067a94c399b\") " Apr 23 18:01:30.355356 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355333 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:01:30.355680 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355525 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-trusted-ca\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.355996 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.355953 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:01:30.357876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.357833 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:01:30.357974 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.357881 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:01:30.357974 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.357903 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:01:30.357974 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.357897 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-kube-api-access-nvzvn" (OuterVolumeSpecName: "kube-api-access-nvzvn") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "kube-api-access-nvzvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:01:30.358194 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.358169 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:01:30.363794 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.363740 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c381f1-454c-4290-9c44-d067a94c399b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d5c381f1-454c-4290-9c44-d067a94c399b" (UID: "d5c381f1-454c-4290-9c44-d067a94c399b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:01:30.456429 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456388 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-registry-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.456429 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456421 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d5c381f1-454c-4290-9c44-d067a94c399b-ca-trust-extracted\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.456429 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456431 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-bound-sa-token\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.456429 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456439 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvzvn\" (UniqueName: \"kubernetes.io/projected/d5c381f1-454c-4290-9c44-d067a94c399b-kube-api-access-nvzvn\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.456719 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456452 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-image-registry-private-configuration\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.456719 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456461 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d5c381f1-454c-4290-9c44-d067a94c399b-installation-pull-secrets\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 
23 18:01:30.456719 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.456470 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d5c381f1-454c-4290-9c44-d067a94c399b-registry-certificates\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:01:30.897959 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.897924 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" event={"ID":"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc","Type":"ContainerStarted","Data":"3bbf7ef271accfe991dac03b6777fe456dd3b38e113a32724fc880cb69dfb4ec"} Apr 23 18:01:30.898366 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.897965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" event={"ID":"2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc","Type":"ContainerStarted","Data":"1ba027c3edc70ef838b841fbe8051599befa10e493275d17983a7caa5b3b59a5"} Apr 23 18:01:30.899085 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.899062 2578 generic.go:358] "Generic (PLEG): container finished" podID="d5c381f1-454c-4290-9c44-d067a94c399b" containerID="8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd" exitCode=0 Apr 23 18:01:30.899129 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.899112 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" Apr 23 18:01:30.899159 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.899143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" event={"ID":"d5c381f1-454c-4290-9c44-d067a94c399b","Type":"ContainerDied","Data":"8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd"} Apr 23 18:01:30.899189 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.899172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-776bb79f5c-2r8bg" event={"ID":"d5c381f1-454c-4290-9c44-d067a94c399b","Type":"ContainerDied","Data":"8ab8a6c237783ecdb01791c0e321c08bb9297777012a70d35a9b58dfe391c44c"} Apr 23 18:01:30.899243 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.899190 2578 scope.go:117] "RemoveContainer" containerID="8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd" Apr 23 18:01:30.907479 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.907459 2578 scope.go:117] "RemoveContainer" containerID="8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd" Apr 23 18:01:30.907829 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:01:30.907806 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd\": container with ID starting with 8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd not found: ID does not exist" containerID="8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd" Apr 23 18:01:30.907894 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.907838 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd"} err="failed to get container status 
\"8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd\": rpc error: code = NotFound desc = could not find container \"8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd\": container with ID starting with 8464a195c30b2f4fd6971056314b331e2622bc1451c38246fe0eabf7ba1bb2bd not found: ID does not exist" Apr 23 18:01:30.919921 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.919875 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-54cfcb5d4b-ngzbz" podStartSLOduration=1.845443483 podStartE2EDuration="5.919861548s" podCreationTimestamp="2026-04-23 18:01:25 +0000 UTC" firstStartedPulling="2026-04-23 18:01:26.04014934 +0000 UTC m=+170.273891240" lastFinishedPulling="2026-04-23 18:01:30.114567398 +0000 UTC m=+174.348309305" observedRunningTime="2026-04-23 18:01:30.918684522 +0000 UTC m=+175.152426450" watchObservedRunningTime="2026-04-23 18:01:30.919861548 +0000 UTC m=+175.153603464" Apr 23 18:01:30.934156 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.934128 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-776bb79f5c-2r8bg"] Apr 23 18:01:30.937714 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:30.937689 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-776bb79f5c-2r8bg"] Apr 23 18:01:31.913261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.913233 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cf66f8b75-qx98g"] Apr 23 18:01:31.913633 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.913491 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" containerName="registry" Apr 23 18:01:31.913633 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.913501 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" containerName="registry" Apr 
23 18:01:31.913633 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.913576 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" containerName="registry" Apr 23 18:01:31.918301 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.918278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:31.920371 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.920353 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 18:01:31.920371 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.920364 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 18:01:31.920515 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.920392 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 18:01:31.921048 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.921033 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 18:01:31.921337 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.921321 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jm8xd\"" Apr 23 18:01:31.921380 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.921353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 18:01:31.921428 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.921326 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 18:01:31.921428 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.921403 2578 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 18:01:31.925314 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.925294 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 18:01:31.929415 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:31.929394 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cf66f8b75-qx98g"] Apr 23 18:01:32.074423 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-oauth-config\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.074423 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct8c6\" (UniqueName: \"kubernetes.io/projected/5fed1806-5d5c-403d-827f-2f49edadee05-kube-api-access-ct8c6\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.074641 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-oauth-serving-cert\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.074641 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-serving-cert\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.074641 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-console-config\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.074641 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-trusted-ca-bundle\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.074775 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.074652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-service-ca\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176033 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.175916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-service-ca\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176033 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176001 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-oauth-config\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176033 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct8c6\" (UniqueName: \"kubernetes.io/projected/5fed1806-5d5c-403d-827f-2f49edadee05-kube-api-access-ct8c6\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176033 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-oauth-serving-cert\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-serving-cert\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-console-config\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176365 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:01:32.176253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-trusted-ca-bundle\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.176912 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-service-ca\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.177004 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-console-config\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.177004 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.176944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-oauth-serving-cert\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.177182 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.177161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-trusted-ca-bundle\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.178613 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.178592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-oauth-config\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.178830 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.178811 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-serving-cert\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.183977 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.183953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct8c6\" (UniqueName: \"kubernetes.io/projected/5fed1806-5d5c-403d-827f-2f49edadee05-kube-api-access-ct8c6\") pod \"console-cf66f8b75-qx98g\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.229119 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.229094 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:32.348887 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.348863 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cf66f8b75-qx98g"] Apr 23 18:01:32.351332 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:01:32.351299 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fed1806_5d5c_403d_827f_2f49edadee05.slice/crio-5848ecb14acc1a75c42d663a62c8cfbdb6fdefadaf204f9197722771ced2b966 WatchSource:0}: Error finding container 5848ecb14acc1a75c42d663a62c8cfbdb6fdefadaf204f9197722771ced2b966: Status 404 returned error can't find the container with id 5848ecb14acc1a75c42d663a62c8cfbdb6fdefadaf204f9197722771ced2b966 Apr 23 18:01:32.358789 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.358742 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c381f1-454c-4290-9c44-d067a94c399b" path="/var/lib/kubelet/pods/d5c381f1-454c-4290-9c44-d067a94c399b/volumes" Apr 23 18:01:32.909088 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:32.909045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf66f8b75-qx98g" event={"ID":"5fed1806-5d5c-403d-827f-2f49edadee05","Type":"ContainerStarted","Data":"5848ecb14acc1a75c42d663a62c8cfbdb6fdefadaf204f9197722771ced2b966"} Apr 23 18:01:35.920365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:35.920323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf66f8b75-qx98g" event={"ID":"5fed1806-5d5c-403d-827f-2f49edadee05","Type":"ContainerStarted","Data":"6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936"} Apr 23 18:01:35.941162 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:35.941112 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cf66f8b75-qx98g" podStartSLOduration=2.378038199 
podStartE2EDuration="4.941095576s" podCreationTimestamp="2026-04-23 18:01:31 +0000 UTC" firstStartedPulling="2026-04-23 18:01:32.353585916 +0000 UTC m=+176.587327810" lastFinishedPulling="2026-04-23 18:01:34.916643289 +0000 UTC m=+179.150385187" observedRunningTime="2026-04-23 18:01:35.940649114 +0000 UTC m=+180.174391030" watchObservedRunningTime="2026-04-23 18:01:35.941095576 +0000 UTC m=+180.174837492" Apr 23 18:01:37.580397 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:37.580309 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" podUID="d9795312-09b0-4528-8cad-f3cbc488baab" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused" Apr 23 18:01:37.926915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:37.926822 2578 generic.go:358] "Generic (PLEG): container finished" podID="d9795312-09b0-4528-8cad-f3cbc488baab" containerID="00ded0364468d315b74d395b8ff1d53ea6f17e34b2f6ded2fd26dd368b201a2e" exitCode=1 Apr 23 18:01:37.927086 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:37.926901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" event={"ID":"d9795312-09b0-4528-8cad-f3cbc488baab","Type":"ContainerDied","Data":"00ded0364468d315b74d395b8ff1d53ea6f17e34b2f6ded2fd26dd368b201a2e"} Apr 23 18:01:37.927321 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:37.927305 2578 scope.go:117] "RemoveContainer" containerID="00ded0364468d315b74d395b8ff1d53ea6f17e34b2f6ded2fd26dd368b201a2e" Apr 23 18:01:38.931683 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:38.931647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" 
event={"ID":"d9795312-09b0-4528-8cad-f3cbc488baab","Type":"ContainerStarted","Data":"2bae39f69d2324a7b6b6258073b68edaded930bf2ab97960931ac7bee416ce36"} Apr 23 18:01:38.932065 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:38.931901 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" Apr 23 18:01:38.932570 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:38.932547 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d887f9654-zvxxn" Apr 23 18:01:42.229644 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:42.229594 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:42.229644 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:42.229646 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:42.234696 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:42.234664 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:42.947297 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:42.947258 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:01:44.960419 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:44.960374 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:44.960419 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:44.960424 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:01:47.885161 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:47.885130 2578 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ingress_router-default-64c9b47658-qqqmr_778d33bd-ade8-4471-a0d0-10670f14a624/router/0.log" Apr 23 18:01:47.899890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:47.899857 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-86jl7_ff148188-17a2-4b88-a857-ae14164f4a06/serve-healthcheck-canary/0.log" Apr 23 18:01:55.983494 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:55.983457 2578 generic.go:358] "Generic (PLEG): container finished" podID="0af64194-8451-4345-9044-583d24fa444c" containerID="a2d1be072539cfeb1b99eff743b9bc64d4def1cb082df824489528256686d710" exitCode=0 Apr 23 18:01:55.983963 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:55.983543 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" event={"ID":"0af64194-8451-4345-9044-583d24fa444c","Type":"ContainerDied","Data":"a2d1be072539cfeb1b99eff743b9bc64d4def1cb082df824489528256686d710"} Apr 23 18:01:55.983963 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:55.983890 2578 scope.go:117] "RemoveContainer" containerID="a2d1be072539cfeb1b99eff743b9bc64d4def1cb082df824489528256686d710" Apr 23 18:01:56.989309 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:01:56.989264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qhb84" event={"ID":"0af64194-8451-4345-9044-583d24fa444c","Type":"ContainerStarted","Data":"7fd2bf36c9ba362147d014cc3b55fcc4a8395d09c7dae6534676e7eb99450c09"} Apr 23 18:02:04.965307 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:04.965276 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:02:04.969365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:04.969338 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-77759d78bc-x8rbw" Apr 23 18:02:48.218631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:48.218597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 18:02:48.220913 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:48.220886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5673cab-427f-416d-a4ba-94ac7c29dc9c-metrics-certs\") pod \"network-metrics-daemon-xwp2q\" (UID: \"c5673cab-427f-416d-a4ba-94ac7c29dc9c\") " pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 18:02:48.357377 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:48.357340 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7tv5t\"" Apr 23 18:02:48.365337 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:48.365310 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xwp2q" Apr 23 18:02:48.482358 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:48.482331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xwp2q"] Apr 23 18:02:48.484491 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:02:48.484463 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5673cab_427f_416d_a4ba_94ac7c29dc9c.slice/crio-f8d2e6721ad3dd8ff99046b453926500c367537f0bf2715775cd04a1b22a5654 WatchSource:0}: Error finding container f8d2e6721ad3dd8ff99046b453926500c367537f0bf2715775cd04a1b22a5654: Status 404 returned error can't find the container with id f8d2e6721ad3dd8ff99046b453926500c367537f0bf2715775cd04a1b22a5654 Apr 23 18:02:49.140212 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:49.140177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xwp2q" event={"ID":"c5673cab-427f-416d-a4ba-94ac7c29dc9c","Type":"ContainerStarted","Data":"f8d2e6721ad3dd8ff99046b453926500c367537f0bf2715775cd04a1b22a5654"} Apr 23 18:02:50.145087 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:50.145052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xwp2q" event={"ID":"c5673cab-427f-416d-a4ba-94ac7c29dc9c","Type":"ContainerStarted","Data":"47d3c4130b6c84b6839819f0c146caf7beb520156ee0e7442268d044187697d8"} Apr 23 18:02:50.145087 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:50.145087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xwp2q" event={"ID":"c5673cab-427f-416d-a4ba-94ac7c29dc9c","Type":"ContainerStarted","Data":"87e11457674877703947709cc6f0512e25b89d289b273a833d8d6220146b530e"} Apr 23 18:02:50.162423 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:50.161881 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-xwp2q" podStartSLOduration=253.241655756 podStartE2EDuration="4m14.161862735s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 18:02:48.486480461 +0000 UTC m=+252.720222355" lastFinishedPulling="2026-04-23 18:02:49.40668744 +0000 UTC m=+253.640429334" observedRunningTime="2026-04-23 18:02:50.160389996 +0000 UTC m=+254.394131921" watchObservedRunningTime="2026-04-23 18:02:50.161862735 +0000 UTC m=+254.395604652" Apr 23 18:02:52.309756 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:02:52.309723 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cf66f8b75-qx98g"] Apr 23 18:03:17.330226 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.330163 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cf66f8b75-qx98g" podUID="5fed1806-5d5c-403d-827f-2f49edadee05" containerName="console" containerID="cri-o://6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936" gracePeriod=15 Apr 23 18:03:17.575583 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.575560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cf66f8b75-qx98g_5fed1806-5d5c-403d-827f-2f49edadee05/console/0.log" Apr 23 18:03:17.575715 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.575622 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:03:17.655639 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.655523 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-serving-cert\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.655639 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.655589 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-service-ca\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.655639 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.655614 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-oauth-serving-cert\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.655639 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.655643 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-oauth-config\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.656038 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.655668 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-console-config\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.656038 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:03:17.655684 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-trusted-ca-bundle\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.656038 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.655720 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct8c6\" (UniqueName: \"kubernetes.io/projected/5fed1806-5d5c-403d-827f-2f49edadee05-kube-api-access-ct8c6\") pod \"5fed1806-5d5c-403d-827f-2f49edadee05\" (UID: \"5fed1806-5d5c-403d-827f-2f49edadee05\") " Apr 23 18:03:17.656200 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.656069 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:03:17.656200 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.656127 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-console-config" (OuterVolumeSpecName: "console-config") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:03:17.656276 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.656212 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:03:17.656380 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.656359 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-service-ca" (OuterVolumeSpecName: "service-ca") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:03:17.658025 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.657987 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:03:17.658025 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.658002 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:03:17.658164 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.658069 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fed1806-5d5c-403d-827f-2f49edadee05-kube-api-access-ct8c6" (OuterVolumeSpecName: "kube-api-access-ct8c6") pod "5fed1806-5d5c-403d-827f-2f49edadee05" (UID: "5fed1806-5d5c-403d-827f-2f49edadee05"). InnerVolumeSpecName "kube-api-access-ct8c6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:03:17.756690 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756657 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-oauth-serving-cert\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:17.756690 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756688 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-oauth-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:17.756690 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756700 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-console-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:17.756922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756712 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-trusted-ca-bundle\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:17.756922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756724 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ct8c6\" (UniqueName: 
\"kubernetes.io/projected/5fed1806-5d5c-403d-827f-2f49edadee05-kube-api-access-ct8c6\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:17.756922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756736 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fed1806-5d5c-403d-827f-2f49edadee05-console-serving-cert\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:17.756922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:17.756748 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fed1806-5d5c-403d-827f-2f49edadee05-service-ca\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:03:18.226814 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.226788 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cf66f8b75-qx98g_5fed1806-5d5c-403d-827f-2f49edadee05/console/0.log" Apr 23 18:03:18.227053 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.226828 2578 generic.go:358] "Generic (PLEG): container finished" podID="5fed1806-5d5c-403d-827f-2f49edadee05" containerID="6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936" exitCode=2 Apr 23 18:03:18.227053 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.226892 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cf66f8b75-qx98g" Apr 23 18:03:18.227053 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.226927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf66f8b75-qx98g" event={"ID":"5fed1806-5d5c-403d-827f-2f49edadee05","Type":"ContainerDied","Data":"6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936"} Apr 23 18:03:18.227053 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.226972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf66f8b75-qx98g" event={"ID":"5fed1806-5d5c-403d-827f-2f49edadee05","Type":"ContainerDied","Data":"5848ecb14acc1a75c42d663a62c8cfbdb6fdefadaf204f9197722771ced2b966"} Apr 23 18:03:18.227053 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.226993 2578 scope.go:117] "RemoveContainer" containerID="6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936" Apr 23 18:03:18.235059 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.235038 2578 scope.go:117] "RemoveContainer" containerID="6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936" Apr 23 18:03:18.235331 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:03:18.235311 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936\": container with ID starting with 6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936 not found: ID does not exist" containerID="6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936" Apr 23 18:03:18.235387 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.235341 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936"} err="failed to get container status \"6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936\": rpc error: code = NotFound 
desc = could not find container \"6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936\": container with ID starting with 6e784b1960937d1e6a14114dd5640d995be05c6b577d502d98b19046e504a936 not found: ID does not exist" Apr 23 18:03:18.250553 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.250152 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cf66f8b75-qx98g"] Apr 23 18:03:18.254005 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.253977 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cf66f8b75-qx98g"] Apr 23 18:03:18.357873 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:18.357839 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fed1806-5d5c-403d-827f-2f49edadee05" path="/var/lib/kubelet/pods/5fed1806-5d5c-403d-827f-2f49edadee05/volumes" Apr 23 18:03:36.229715 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:36.229681 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:03:36.230438 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:36.230423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:03:36.236778 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:36.236752 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:03:36.237473 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:36.237454 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:03:36.240395 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:03:36.239950 2578 kubelet.go:1628] 
"Image garbage collection succeeded" Apr 23 18:05:08.436111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.436073 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-874ff48d-7rjqg"] Apr 23 18:05:08.436717 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.436488 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fed1806-5d5c-403d-827f-2f49edadee05" containerName="console" Apr 23 18:05:08.436717 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.436504 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fed1806-5d5c-403d-827f-2f49edadee05" containerName="console" Apr 23 18:05:08.436717 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.436610 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fed1806-5d5c-403d-827f-2f49edadee05" containerName="console" Apr 23 18:05:08.438565 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.438526 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:08.440587 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.440564 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 18:05:08.440587 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.440565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-sl9qp\"" Apr 23 18:05:08.440737 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.440601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 18:05:08.440919 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.440905 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 18:05:08.446675 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.446650 2578 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-snz24"] Apr 23 18:05:08.448731 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.448712 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-7rjqg"] Apr 23 18:05:08.448834 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.448819 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:08.450810 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.450787 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 18:05:08.450894 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.450815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-846gk\"" Apr 23 18:05:08.461456 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.461435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-snz24"] Apr 23 18:05:08.521159 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.521128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:08.521337 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.521165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c35681e-156f-4eae-8f46-35c66086eb3e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: \"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:08.521337 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:05:08.521195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswfx\" (UniqueName: \"kubernetes.io/projected/9c35681e-156f-4eae-8f46-35c66086eb3e-kube-api-access-vswfx\") pod \"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: \"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:08.521337 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.521251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z756\" (UniqueName: \"kubernetes.io/projected/ae2a2be5-b4da-4fce-b6db-26318c3219ff-kube-api-access-4z756\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:08.622175 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.622139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:08.622175 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.622175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c35681e-156f-4eae-8f46-35c66086eb3e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: \"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:08.622383 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.622197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vswfx\" (UniqueName: \"kubernetes.io/projected/9c35681e-156f-4eae-8f46-35c66086eb3e-kube-api-access-vswfx\") pod 
\"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: \"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:08.622383 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.622215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z756\" (UniqueName: \"kubernetes.io/projected/ae2a2be5-b4da-4fce-b6db-26318c3219ff-kube-api-access-4z756\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:08.622383 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:05:08.622288 2578 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 23 18:05:08.622383 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:05:08.622364 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c35681e-156f-4eae-8f46-35c66086eb3e-cert podName:9c35681e-156f-4eae-8f46-35c66086eb3e nodeName:}" failed. No retries permitted until 2026-04-23 18:05:09.122349989 +0000 UTC m=+393.356091883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c35681e-156f-4eae-8f46-35c66086eb3e-cert") pod "llmisvc-controller-manager-68cc5db7c4-snz24" (UID: "9c35681e-156f-4eae-8f46-35c66086eb3e") : secret "llmisvc-webhook-server-cert" not found Apr 23 18:05:08.622383 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:05:08.622293 2578 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 23 18:05:08.622581 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:05:08.622435 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert podName:ae2a2be5-b4da-4fce-b6db-26318c3219ff nodeName:}" failed. 
No retries permitted until 2026-04-23 18:05:09.122421502 +0000 UTC m=+393.356163402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert") pod "kserve-controller-manager-874ff48d-7rjqg" (UID: "ae2a2be5-b4da-4fce-b6db-26318c3219ff") : secret "kserve-webhook-server-cert" not found Apr 23 18:05:08.632210 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.632188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z756\" (UniqueName: \"kubernetes.io/projected/ae2a2be5-b4da-4fce-b6db-26318c3219ff-kube-api-access-4z756\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:08.632318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:08.632211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswfx\" (UniqueName: \"kubernetes.io/projected/9c35681e-156f-4eae-8f46-35c66086eb3e-kube-api-access-vswfx\") pod \"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: \"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:09.126404 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.126352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:09.126404 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.126407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c35681e-156f-4eae-8f46-35c66086eb3e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: 
\"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:09.129134 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.129107 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert\") pod \"kserve-controller-manager-874ff48d-7rjqg\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:09.129211 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.129107 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c35681e-156f-4eae-8f46-35c66086eb3e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-snz24\" (UID: \"9c35681e-156f-4eae-8f46-35c66086eb3e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:09.349739 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.349700 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:09.359945 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.359912 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:09.476657 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.476626 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-7rjqg"] Apr 23 18:05:09.479968 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:05:09.479925 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2a2be5_b4da_4fce_b6db_26318c3219ff.slice/crio-a9d32fbd47d55804ff317651b594e8d3f3a5239bad107b1bf449fdeb0fa97098 WatchSource:0}: Error finding container a9d32fbd47d55804ff317651b594e8d3f3a5239bad107b1bf449fdeb0fa97098: Status 404 returned error can't find the container with id a9d32fbd47d55804ff317651b594e8d3f3a5239bad107b1bf449fdeb0fa97098 Apr 23 18:05:09.481316 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.481294 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:05:09.503059 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.503035 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-snz24"] Apr 23 18:05:09.537493 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.537463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" event={"ID":"9c35681e-156f-4eae-8f46-35c66086eb3e","Type":"ContainerStarted","Data":"dd79b28080180b8f8b54f68fe248d08c92653442adb76f987aedd31c52fca4a6"} Apr 23 18:05:09.538390 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:09.538369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" event={"ID":"ae2a2be5-b4da-4fce-b6db-26318c3219ff","Type":"ContainerStarted","Data":"a9d32fbd47d55804ff317651b594e8d3f3a5239bad107b1bf449fdeb0fa97098"} Apr 23 18:05:13.553228 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:13.553189 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" event={"ID":"9c35681e-156f-4eae-8f46-35c66086eb3e","Type":"ContainerStarted","Data":"833278939e1b6138991ec21253fa6f54e1102021890ac704c6f8d24368e9f9d4"} Apr 23 18:05:13.553725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:13.553306 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:13.554487 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:13.554459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" event={"ID":"ae2a2be5-b4da-4fce-b6db-26318c3219ff","Type":"ContainerStarted","Data":"185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16"} Apr 23 18:05:13.554629 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:13.554585 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:13.568590 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:13.568520 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" podStartSLOduration=2.408857757 podStartE2EDuration="5.568506811s" podCreationTimestamp="2026-04-23 18:05:08 +0000 UTC" firstStartedPulling="2026-04-23 18:05:09.508253155 +0000 UTC m=+393.741995048" lastFinishedPulling="2026-04-23 18:05:12.667902207 +0000 UTC m=+396.901644102" observedRunningTime="2026-04-23 18:05:13.56790691 +0000 UTC m=+397.801648847" watchObservedRunningTime="2026-04-23 18:05:13.568506811 +0000 UTC m=+397.802248768" Apr 23 18:05:13.582650 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:13.582605 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" podStartSLOduration=2.395127662 podStartE2EDuration="5.582593743s" podCreationTimestamp="2026-04-23 18:05:08 +0000 UTC" 
firstStartedPulling="2026-04-23 18:05:09.481483888 +0000 UTC m=+393.715225794" lastFinishedPulling="2026-04-23 18:05:12.668949981 +0000 UTC m=+396.902691875" observedRunningTime="2026-04-23 18:05:13.581195617 +0000 UTC m=+397.814937527" watchObservedRunningTime="2026-04-23 18:05:13.582593743 +0000 UTC m=+397.816335704" Apr 23 18:05:44.560195 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:44.560162 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-snz24" Apr 23 18:05:44.563298 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:44.563270 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:45.922810 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:45.922772 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-7rjqg"] Apr 23 18:05:45.923265 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:45.922980 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" podUID="ae2a2be5-b4da-4fce-b6db-26318c3219ff" containerName="manager" containerID="cri-o://185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16" gracePeriod=10 Apr 23 18:05:45.946097 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:45.946061 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-874ff48d-4pn4f"] Apr 23 18:05:46.000894 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.000861 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-4pn4f"] Apr 23 18:05:46.001136 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.000988 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.031793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.031753 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53ed2eb6-0c4c-434b-ab08-e950a4695d38-cert\") pod \"kserve-controller-manager-874ff48d-4pn4f\" (UID: \"53ed2eb6-0c4c-434b-ab08-e950a4695d38\") " pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.031947 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.031823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjbp\" (UniqueName: \"kubernetes.io/projected/53ed2eb6-0c4c-434b-ab08-e950a4695d38-kube-api-access-8qjbp\") pod \"kserve-controller-manager-874ff48d-4pn4f\" (UID: \"53ed2eb6-0c4c-434b-ab08-e950a4695d38\") " pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.132417 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.132384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53ed2eb6-0c4c-434b-ab08-e950a4695d38-cert\") pod \"kserve-controller-manager-874ff48d-4pn4f\" (UID: \"53ed2eb6-0c4c-434b-ab08-e950a4695d38\") " pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.132604 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.132444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjbp\" (UniqueName: \"kubernetes.io/projected/53ed2eb6-0c4c-434b-ab08-e950a4695d38-kube-api-access-8qjbp\") pod \"kserve-controller-manager-874ff48d-4pn4f\" (UID: \"53ed2eb6-0c4c-434b-ab08-e950a4695d38\") " pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.134920 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.134884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/53ed2eb6-0c4c-434b-ab08-e950a4695d38-cert\") pod \"kserve-controller-manager-874ff48d-4pn4f\" (UID: \"53ed2eb6-0c4c-434b-ab08-e950a4695d38\") " pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.140718 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.140688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjbp\" (UniqueName: \"kubernetes.io/projected/53ed2eb6-0c4c-434b-ab08-e950a4695d38-kube-api-access-8qjbp\") pod \"kserve-controller-manager-874ff48d-4pn4f\" (UID: \"53ed2eb6-0c4c-434b-ab08-e950a4695d38\") " pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.181494 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.181432 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:46.233383 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.233346 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z756\" (UniqueName: \"kubernetes.io/projected/ae2a2be5-b4da-4fce-b6db-26318c3219ff-kube-api-access-4z756\") pod \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " Apr 23 18:05:46.233553 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.233390 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert\") pod \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\" (UID: \"ae2a2be5-b4da-4fce-b6db-26318c3219ff\") " Apr 23 18:05:46.235570 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.235516 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2a2be5-b4da-4fce-b6db-26318c3219ff-kube-api-access-4z756" (OuterVolumeSpecName: "kube-api-access-4z756") pod "ae2a2be5-b4da-4fce-b6db-26318c3219ff" (UID: "ae2a2be5-b4da-4fce-b6db-26318c3219ff"). 
InnerVolumeSpecName "kube-api-access-4z756". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:05:46.235682 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.235576 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert" (OuterVolumeSpecName: "cert") pod "ae2a2be5-b4da-4fce-b6db-26318c3219ff" (UID: "ae2a2be5-b4da-4fce-b6db-26318c3219ff"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:05:46.334020 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.333977 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4z756\" (UniqueName: \"kubernetes.io/projected/ae2a2be5-b4da-4fce-b6db-26318c3219ff-kube-api-access-4z756\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:05:46.334020 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.334014 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae2a2be5-b4da-4fce-b6db-26318c3219ff-cert\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:05:46.367080 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.367041 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:46.499072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.499042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-4pn4f"] Apr 23 18:05:46.499437 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:05:46.499412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ed2eb6_0c4c_434b_ab08_e950a4695d38.slice/crio-eed9cc15db9fac018b99ad24235d4c51c83cc8efe4e24afe51a0cb1d0a9a0cf7 WatchSource:0}: Error finding container eed9cc15db9fac018b99ad24235d4c51c83cc8efe4e24afe51a0cb1d0a9a0cf7: Status 404 returned error can't find the container with id eed9cc15db9fac018b99ad24235d4c51c83cc8efe4e24afe51a0cb1d0a9a0cf7 Apr 23 18:05:46.651460 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.651422 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae2a2be5-b4da-4fce-b6db-26318c3219ff" containerID="185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16" exitCode=0 Apr 23 18:05:46.651670 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.651511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" event={"ID":"ae2a2be5-b4da-4fce-b6db-26318c3219ff","Type":"ContainerDied","Data":"185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16"} Apr 23 18:05:46.651670 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.651578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" event={"ID":"ae2a2be5-b4da-4fce-b6db-26318c3219ff","Type":"ContainerDied","Data":"a9d32fbd47d55804ff317651b594e8d3f3a5239bad107b1bf449fdeb0fa97098"} Apr 23 18:05:46.651670 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.651525 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-874ff48d-7rjqg" Apr 23 18:05:46.651670 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.651603 2578 scope.go:117] "RemoveContainer" containerID="185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16" Apr 23 18:05:46.652841 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.652816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" event={"ID":"53ed2eb6-0c4c-434b-ab08-e950a4695d38","Type":"ContainerStarted","Data":"eed9cc15db9fac018b99ad24235d4c51c83cc8efe4e24afe51a0cb1d0a9a0cf7"} Apr 23 18:05:46.659764 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.659745 2578 scope.go:117] "RemoveContainer" containerID="185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16" Apr 23 18:05:46.660022 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:05:46.660003 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16\": container with ID starting with 185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16 not found: ID does not exist" containerID="185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16" Apr 23 18:05:46.660069 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.660034 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16"} err="failed to get container status \"185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16\": rpc error: code = NotFound desc = could not find container \"185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16\": container with ID starting with 185ffd35e500e0baaa68091adc8b2381aa4bf1901fd8195ce89d8c1e3e6a1b16 not found: ID does not exist" Apr 23 18:05:46.666658 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.666619 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-7rjqg"] Apr 23 18:05:46.668012 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:46.667989 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-874ff48d-7rjqg"] Apr 23 18:05:47.657268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:47.657164 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" event={"ID":"53ed2eb6-0c4c-434b-ab08-e950a4695d38","Type":"ContainerStarted","Data":"10112068b2eb9d6a4f015d048c43c22c4d1d7ebac40838c973708ece5bc14f47"} Apr 23 18:05:47.657699 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:47.657264 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:05:47.673074 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:47.673020 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" podStartSLOduration=1.976560209 podStartE2EDuration="2.673000561s" podCreationTimestamp="2026-04-23 18:05:45 +0000 UTC" firstStartedPulling="2026-04-23 18:05:46.500587031 +0000 UTC m=+430.734328925" lastFinishedPulling="2026-04-23 18:05:47.197027372 +0000 UTC m=+431.430769277" observedRunningTime="2026-04-23 18:05:47.672033467 +0000 UTC m=+431.905775384" watchObservedRunningTime="2026-04-23 18:05:47.673000561 +0000 UTC m=+431.906742478" Apr 23 18:05:48.357368 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:05:48.357324 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2a2be5-b4da-4fce-b6db-26318c3219ff" path="/var/lib/kubelet/pods/ae2a2be5-b4da-4fce-b6db-26318c3219ff/volumes" Apr 23 18:06:18.666774 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:18.666738 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-874ff48d-4pn4f" Apr 23 18:06:20.253832 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.253794 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-lhlk7"] Apr 23 18:06:20.254236 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.254176 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae2a2be5-b4da-4fce-b6db-26318c3219ff" containerName="manager" Apr 23 18:06:20.254236 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.254188 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2a2be5-b4da-4fce-b6db-26318c3219ff" containerName="manager" Apr 23 18:06:20.254308 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.254244 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae2a2be5-b4da-4fce-b6db-26318c3219ff" containerName="manager" Apr 23 18:06:20.257602 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.257584 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.259589 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.259564 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-72d9s\"" Apr 23 18:06:20.259589 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.259581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 23 18:06:20.265701 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.265677 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-lhlk7"] Apr 23 18:06:20.281954 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.281916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zp8v\" (UniqueName: \"kubernetes.io/projected/c873385c-79c2-484a-9276-a053d3fe4743-kube-api-access-9zp8v\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: 
\"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.281954 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.281957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.383408 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.383367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp8v\" (UniqueName: \"kubernetes.io/projected/c873385c-79c2-484a-9276-a053d3fe4743-kube-api-access-9zp8v\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.383408 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.383402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.383709 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:06:20.383614 2578 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 23 18:06:20.383709 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:06:20.383694 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert podName:c873385c-79c2-484a-9276-a053d3fe4743 nodeName:}" failed. No retries permitted until 2026-04-23 18:06:20.883675316 +0000 UTC m=+465.117417214 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert") pod "odh-model-controller-696fc77849-lhlk7" (UID: "c873385c-79c2-484a-9276-a053d3fe4743") : secret "odh-model-controller-webhook-cert" not found Apr 23 18:06:20.398980 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.398949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zp8v\" (UniqueName: \"kubernetes.io/projected/c873385c-79c2-484a-9276-a053d3fe4743-kube-api-access-9zp8v\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.887795 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:20.887756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:20.887982 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:06:20.887912 2578 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 23 18:06:20.888040 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:06:20.887987 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert podName:c873385c-79c2-484a-9276-a053d3fe4743 nodeName:}" failed. No retries permitted until 2026-04-23 18:06:21.887968624 +0000 UTC m=+466.121710518 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert") pod "odh-model-controller-696fc77849-lhlk7" (UID: "c873385c-79c2-484a-9276-a053d3fe4743") : secret "odh-model-controller-webhook-cert" not found Apr 23 18:06:21.897202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:21.897167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:21.899504 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:21.899480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c873385c-79c2-484a-9276-a053d3fe4743-cert\") pod \"odh-model-controller-696fc77849-lhlk7\" (UID: \"c873385c-79c2-484a-9276-a053d3fe4743\") " pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:22.069097 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:22.069060 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:22.188205 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:22.188108 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-lhlk7"] Apr 23 18:06:22.191859 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:06:22.191823 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc873385c_79c2_484a_9276_a053d3fe4743.slice/crio-226a2c738ac98134e6bc7b64dbeb6ce7b6da38efb391fd01ddd4dff1d8b71fa2 WatchSource:0}: Error finding container 226a2c738ac98134e6bc7b64dbeb6ce7b6da38efb391fd01ddd4dff1d8b71fa2: Status 404 returned error can't find the container with id 226a2c738ac98134e6bc7b64dbeb6ce7b6da38efb391fd01ddd4dff1d8b71fa2 Apr 23 18:06:22.764354 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:22.764320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-lhlk7" event={"ID":"c873385c-79c2-484a-9276-a053d3fe4743","Type":"ContainerStarted","Data":"226a2c738ac98134e6bc7b64dbeb6ce7b6da38efb391fd01ddd4dff1d8b71fa2"} Apr 23 18:06:25.776389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:25.776348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-lhlk7" event={"ID":"c873385c-79c2-484a-9276-a053d3fe4743","Type":"ContainerStarted","Data":"6015500cc9a4b29f7ae72fcae40a8b47cf52df411d516a285e95b4ecc03cac41"} Apr 23 18:06:25.776760 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:25.776485 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:25.793178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:25.793133 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-lhlk7" podStartSLOduration=3.055363852 podStartE2EDuration="5.793119126s" 
podCreationTimestamp="2026-04-23 18:06:20 +0000 UTC" firstStartedPulling="2026-04-23 18:06:22.193188823 +0000 UTC m=+466.426930731" lastFinishedPulling="2026-04-23 18:06:24.930944111 +0000 UTC m=+469.164686005" observedRunningTime="2026-04-23 18:06:25.792490455 +0000 UTC m=+470.026232381" watchObservedRunningTime="2026-04-23 18:06:25.793119126 +0000 UTC m=+470.026861043" Apr 23 18:06:36.781915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:36.781886 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-lhlk7" Apr 23 18:06:37.617938 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.617902 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-mls9c"] Apr 23 18:06:37.622276 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.622261 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mls9c" Apr 23 18:06:37.624183 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.624160 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 18:06:37.624288 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.624241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fr69r\"" Apr 23 18:06:37.626796 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.626773 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mls9c"] Apr 23 18:06:37.735128 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.735097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwns\" (UniqueName: \"kubernetes.io/projected/d1d0fa94-8970-4c49-9bad-c7a1b218c2a0-kube-api-access-mzwns\") pod \"s3-init-mls9c\" (UID: \"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0\") " pod="kserve/s3-init-mls9c" Apr 23 18:06:37.836078 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.836048 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzwns\" (UniqueName: \"kubernetes.io/projected/d1d0fa94-8970-4c49-9bad-c7a1b218c2a0-kube-api-access-mzwns\") pod \"s3-init-mls9c\" (UID: \"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0\") " pod="kserve/s3-init-mls9c" Apr 23 18:06:37.844010 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.843983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwns\" (UniqueName: \"kubernetes.io/projected/d1d0fa94-8970-4c49-9bad-c7a1b218c2a0-kube-api-access-mzwns\") pod \"s3-init-mls9c\" (UID: \"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0\") " pod="kserve/s3-init-mls9c" Apr 23 18:06:37.944706 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:37.944632 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mls9c" Apr 23 18:06:38.062338 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:38.062311 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mls9c"] Apr 23 18:06:38.064796 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:06:38.064766 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1d0fa94_8970_4c49_9bad_c7a1b218c2a0.slice/crio-5bf44d48cddcfee324366b8b8bd249f823329765b098d10ca584be4e18aec892 WatchSource:0}: Error finding container 5bf44d48cddcfee324366b8b8bd249f823329765b098d10ca584be4e18aec892: Status 404 returned error can't find the container with id 5bf44d48cddcfee324366b8b8bd249f823329765b098d10ca584be4e18aec892 Apr 23 18:06:38.818211 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:38.817788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mls9c" event={"ID":"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0","Type":"ContainerStarted","Data":"5bf44d48cddcfee324366b8b8bd249f823329765b098d10ca584be4e18aec892"} Apr 23 18:06:42.834275 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:42.834185 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mls9c" event={"ID":"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0","Type":"ContainerStarted","Data":"2d18dad55148f76200b253839d65057df0eff8950222d64714aa3c5eaf9a7457"} Apr 23 18:06:42.850656 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:42.850601 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-mls9c" podStartSLOduration=1.396905601 podStartE2EDuration="5.850583832s" podCreationTimestamp="2026-04-23 18:06:37 +0000 UTC" firstStartedPulling="2026-04-23 18:06:38.067201772 +0000 UTC m=+482.300943669" lastFinishedPulling="2026-04-23 18:06:42.520880002 +0000 UTC m=+486.754621900" observedRunningTime="2026-04-23 18:06:42.848498761 +0000 UTC m=+487.082240678" watchObservedRunningTime="2026-04-23 18:06:42.850583832 +0000 UTC m=+487.084325748" Apr 23 18:06:45.848374 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:45.848335 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1d0fa94-8970-4c49-9bad-c7a1b218c2a0" containerID="2d18dad55148f76200b253839d65057df0eff8950222d64714aa3c5eaf9a7457" exitCode=0 Apr 23 18:06:45.848781 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:45.848413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mls9c" event={"ID":"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0","Type":"ContainerDied","Data":"2d18dad55148f76200b253839d65057df0eff8950222d64714aa3c5eaf9a7457"} Apr 23 18:06:46.970049 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:46.970027 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-mls9c" Apr 23 18:06:47.122049 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:47.121961 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzwns\" (UniqueName: \"kubernetes.io/projected/d1d0fa94-8970-4c49-9bad-c7a1b218c2a0-kube-api-access-mzwns\") pod \"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0\" (UID: \"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0\") " Apr 23 18:06:47.123988 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:47.123955 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d0fa94-8970-4c49-9bad-c7a1b218c2a0-kube-api-access-mzwns" (OuterVolumeSpecName: "kube-api-access-mzwns") pod "d1d0fa94-8970-4c49-9bad-c7a1b218c2a0" (UID: "d1d0fa94-8970-4c49-9bad-c7a1b218c2a0"). InnerVolumeSpecName "kube-api-access-mzwns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:06:47.223077 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:47.223045 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzwns\" (UniqueName: \"kubernetes.io/projected/d1d0fa94-8970-4c49-9bad-c7a1b218c2a0-kube-api-access-mzwns\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:06:47.855964 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:47.855935 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-mls9c" Apr 23 18:06:47.856136 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:47.855933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mls9c" event={"ID":"d1d0fa94-8970-4c49-9bad-c7a1b218c2a0","Type":"ContainerDied","Data":"5bf44d48cddcfee324366b8b8bd249f823329765b098d10ca584be4e18aec892"} Apr 23 18:06:47.856136 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:47.856041 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf44d48cddcfee324366b8b8bd249f823329765b098d10ca584be4e18aec892" Apr 23 18:06:48.598707 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.598675 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx"] Apr 23 18:06:48.599064 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.599019 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d0fa94-8970-4c49-9bad-c7a1b218c2a0" containerName="s3-init" Apr 23 18:06:48.599064 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.599035 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d0fa94-8970-4c49-9bad-c7a1b218c2a0" containerName="s3-init" Apr 23 18:06:48.599138 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.599085 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1d0fa94-8970-4c49-9bad-c7a1b218c2a0" containerName="s3-init" Apr 23 18:06:48.602337 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.602320 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.604275 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.604258 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fr69r\"" Apr 23 18:06:48.604349 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.604281 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 23 18:06:48.608295 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.608270 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx"] Apr 23 18:06:48.734896 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.734860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eaeb43af-176c-41ba-9736-abf4c677602a-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-4tscx\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.735098 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.734915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrnc\" (UniqueName: \"kubernetes.io/projected/eaeb43af-176c-41ba-9736-abf4c677602a-kube-api-access-xxrnc\") pod \"seaweedfs-tls-custom-ddd4dbfd-4tscx\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.836207 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.836177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eaeb43af-176c-41ba-9736-abf4c677602a-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-4tscx\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.836381 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:06:48.836224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxrnc\" (UniqueName: \"kubernetes.io/projected/eaeb43af-176c-41ba-9736-abf4c677602a-kube-api-access-xxrnc\") pod \"seaweedfs-tls-custom-ddd4dbfd-4tscx\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.836646 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.836621 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eaeb43af-176c-41ba-9736-abf4c677602a-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-4tscx\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.844418 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.844388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxrnc\" (UniqueName: \"kubernetes.io/projected/eaeb43af-176c-41ba-9736-abf4c677602a-kube-api-access-xxrnc\") pod \"seaweedfs-tls-custom-ddd4dbfd-4tscx\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:48.912059 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:48.911984 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:06:49.027289 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:49.027202 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx"] Apr 23 18:06:49.030100 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:06:49.030072 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaeb43af_176c_41ba_9736_abf4c677602a.slice/crio-61c2240bbf8dab80772a0c1a704c8b96c665c98f282daa0bba0befe760f24f65 WatchSource:0}: Error finding container 61c2240bbf8dab80772a0c1a704c8b96c665c98f282daa0bba0befe760f24f65: Status 404 returned error can't find the container with id 61c2240bbf8dab80772a0c1a704c8b96c665c98f282daa0bba0befe760f24f65 Apr 23 18:06:49.862549 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:49.862498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" event={"ID":"eaeb43af-176c-41ba-9736-abf4c677602a","Type":"ContainerStarted","Data":"61c2240bbf8dab80772a0c1a704c8b96c665c98f282daa0bba0befe760f24f65"} Apr 23 18:06:51.869550 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:51.869449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" event={"ID":"eaeb43af-176c-41ba-9736-abf4c677602a","Type":"ContainerStarted","Data":"28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70"} Apr 23 18:06:51.885353 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:51.885311 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" podStartSLOduration=1.372148626 podStartE2EDuration="3.885297811s" podCreationTimestamp="2026-04-23 18:06:48 +0000 UTC" firstStartedPulling="2026-04-23 18:06:49.031438415 +0000 UTC m=+493.265180309" lastFinishedPulling="2026-04-23 18:06:51.544587597 +0000 UTC m=+495.778329494" 
observedRunningTime="2026-04-23 18:06:51.884523172 +0000 UTC m=+496.118265087" watchObservedRunningTime="2026-04-23 18:06:51.885297811 +0000 UTC m=+496.119039727" Apr 23 18:06:52.666769 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:52.666728 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx"] Apr 23 18:06:53.875898 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:06:53.875834 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" podUID="eaeb43af-176c-41ba-9736-abf4c677602a" containerName="seaweedfs-tls-custom" containerID="cri-o://28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70" gracePeriod=30 Apr 23 18:07:22.725427 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.725405 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:07:22.809546 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.809501 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eaeb43af-176c-41ba-9736-abf4c677602a-data\") pod \"eaeb43af-176c-41ba-9736-abf4c677602a\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " Apr 23 18:07:22.809703 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.809573 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxrnc\" (UniqueName: \"kubernetes.io/projected/eaeb43af-176c-41ba-9736-abf4c677602a-kube-api-access-xxrnc\") pod \"eaeb43af-176c-41ba-9736-abf4c677602a\" (UID: \"eaeb43af-176c-41ba-9736-abf4c677602a\") " Apr 23 18:07:22.810753 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.810722 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaeb43af-176c-41ba-9736-abf4c677602a-data" (OuterVolumeSpecName: "data") pod "eaeb43af-176c-41ba-9736-abf4c677602a" (UID: 
"eaeb43af-176c-41ba-9736-abf4c677602a"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:07:22.811509 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.811485 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaeb43af-176c-41ba-9736-abf4c677602a-kube-api-access-xxrnc" (OuterVolumeSpecName: "kube-api-access-xxrnc") pod "eaeb43af-176c-41ba-9736-abf4c677602a" (UID: "eaeb43af-176c-41ba-9736-abf4c677602a"). InnerVolumeSpecName "kube-api-access-xxrnc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:07:22.910514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.910445 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxrnc\" (UniqueName: \"kubernetes.io/projected/eaeb43af-176c-41ba-9736-abf4c677602a-kube-api-access-xxrnc\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:07:22.910514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.910468 2578 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eaeb43af-176c-41ba-9736-abf4c677602a-data\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:07:22.958873 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.958846 2578 generic.go:358] "Generic (PLEG): container finished" podID="eaeb43af-176c-41ba-9736-abf4c677602a" containerID="28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70" exitCode=0 Apr 23 18:07:22.959015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.958887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" event={"ID":"eaeb43af-176c-41ba-9736-abf4c677602a","Type":"ContainerDied","Data":"28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70"} Apr 23 18:07:22.959015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.958898 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" Apr 23 18:07:22.959015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.958909 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx" event={"ID":"eaeb43af-176c-41ba-9736-abf4c677602a","Type":"ContainerDied","Data":"61c2240bbf8dab80772a0c1a704c8b96c665c98f282daa0bba0befe760f24f65"} Apr 23 18:07:22.959015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.958939 2578 scope.go:117] "RemoveContainer" containerID="28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70" Apr 23 18:07:22.968412 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.968392 2578 scope.go:117] "RemoveContainer" containerID="28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70" Apr 23 18:07:22.968879 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:07:22.968859 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70\": container with ID starting with 28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70 not found: ID does not exist" containerID="28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70" Apr 23 18:07:22.968953 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.968886 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70"} err="failed to get container status \"28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70\": rpc error: code = NotFound desc = could not find container \"28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70\": container with ID starting with 28efb63616a4c84f08586f2f2b1392a4d6ffd1b8e5b93bbef02949bde0292d70 not found: ID does not exist" Apr 23 18:07:22.978854 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.978831 2578 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx"] Apr 23 18:07:22.984111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:22.984092 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4tscx"] Apr 23 18:07:24.362736 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.362696 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaeb43af-176c-41ba-9736-abf4c677602a" path="/var/lib/kubelet/pods/eaeb43af-176c-41ba-9736-abf4c677602a/volumes" Apr 23 18:07:24.476178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.476142 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-svf87"] Apr 23 18:07:24.476439 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.476427 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eaeb43af-176c-41ba-9736-abf4c677602a" containerName="seaweedfs-tls-custom" Apr 23 18:07:24.476480 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.476441 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaeb43af-176c-41ba-9736-abf4c677602a" containerName="seaweedfs-tls-custom" Apr 23 18:07:24.476511 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.476503 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="eaeb43af-176c-41ba-9736-abf4c677602a" containerName="seaweedfs-tls-custom" Apr 23 18:07:24.481487 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.481470 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:24.483416 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.483390 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 23 18:07:24.483529 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.483470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fr69r\"" Apr 23 18:07:24.486573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.486501 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-svf87"] Apr 23 18:07:24.625665 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.625571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jrm\" (UniqueName: \"kubernetes.io/projected/cac62d37-3ae2-41f0-b2ef-81e680878bd4-kube-api-access-w9jrm\") pod \"s3-tls-init-custom-svf87\" (UID: \"cac62d37-3ae2-41f0-b2ef-81e680878bd4\") " pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:24.726722 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.726688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jrm\" (UniqueName: \"kubernetes.io/projected/cac62d37-3ae2-41f0-b2ef-81e680878bd4-kube-api-access-w9jrm\") pod \"s3-tls-init-custom-svf87\" (UID: \"cac62d37-3ae2-41f0-b2ef-81e680878bd4\") " pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:24.735238 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.735206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jrm\" (UniqueName: \"kubernetes.io/projected/cac62d37-3ae2-41f0-b2ef-81e680878bd4-kube-api-access-w9jrm\") pod \"s3-tls-init-custom-svf87\" (UID: \"cac62d37-3ae2-41f0-b2ef-81e680878bd4\") " pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:24.802993 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.802956 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:24.920809 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.920737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-svf87"] Apr 23 18:07:24.923557 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:07:24.923514 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac62d37_3ae2_41f0_b2ef_81e680878bd4.slice/crio-807652c216153c1a2332bb316cb80a9030b37617a3820fdbb2e87262dd9a6400 WatchSource:0}: Error finding container 807652c216153c1a2332bb316cb80a9030b37617a3820fdbb2e87262dd9a6400: Status 404 returned error can't find the container with id 807652c216153c1a2332bb316cb80a9030b37617a3820fdbb2e87262dd9a6400 Apr 23 18:07:24.969672 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:24.969641 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-svf87" event={"ID":"cac62d37-3ae2-41f0-b2ef-81e680878bd4","Type":"ContainerStarted","Data":"807652c216153c1a2332bb316cb80a9030b37617a3820fdbb2e87262dd9a6400"} Apr 23 18:07:25.973792 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:25.973753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-svf87" event={"ID":"cac62d37-3ae2-41f0-b2ef-81e680878bd4","Type":"ContainerStarted","Data":"55d43b54a658650edae3e7ae32fea41a40e12cda40fa11f426a994817ede6b81"} Apr 23 18:07:25.989528 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:25.989372 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-svf87" podStartSLOduration=1.989357516 podStartE2EDuration="1.989357516s" podCreationTimestamp="2026-04-23 18:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:07:25.989211418 +0000 UTC m=+530.222953346" watchObservedRunningTime="2026-04-23 
18:07:25.989357516 +0000 UTC m=+530.223099433" Apr 23 18:07:29.987226 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:29.987186 2578 generic.go:358] "Generic (PLEG): container finished" podID="cac62d37-3ae2-41f0-b2ef-81e680878bd4" containerID="55d43b54a658650edae3e7ae32fea41a40e12cda40fa11f426a994817ede6b81" exitCode=0 Apr 23 18:07:29.987640 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:29.987259 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-svf87" event={"ID":"cac62d37-3ae2-41f0-b2ef-81e680878bd4","Type":"ContainerDied","Data":"55d43b54a658650edae3e7ae32fea41a40e12cda40fa11f426a994817ede6b81"} Apr 23 18:07:31.114294 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.114264 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:31.178294 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.178266 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jrm\" (UniqueName: \"kubernetes.io/projected/cac62d37-3ae2-41f0-b2ef-81e680878bd4-kube-api-access-w9jrm\") pod \"cac62d37-3ae2-41f0-b2ef-81e680878bd4\" (UID: \"cac62d37-3ae2-41f0-b2ef-81e680878bd4\") " Apr 23 18:07:31.180285 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.180258 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac62d37-3ae2-41f0-b2ef-81e680878bd4-kube-api-access-w9jrm" (OuterVolumeSpecName: "kube-api-access-w9jrm") pod "cac62d37-3ae2-41f0-b2ef-81e680878bd4" (UID: "cac62d37-3ae2-41f0-b2ef-81e680878bd4"). InnerVolumeSpecName "kube-api-access-w9jrm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:07:31.279492 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.279454 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9jrm\" (UniqueName: \"kubernetes.io/projected/cac62d37-3ae2-41f0-b2ef-81e680878bd4-kube-api-access-w9jrm\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:07:31.993846 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.993816 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-svf87" Apr 23 18:07:31.993846 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.993823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-svf87" event={"ID":"cac62d37-3ae2-41f0-b2ef-81e680878bd4","Type":"ContainerDied","Data":"807652c216153c1a2332bb316cb80a9030b37617a3820fdbb2e87262dd9a6400"} Apr 23 18:07:31.994044 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:31.993855 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="807652c216153c1a2332bb316cb80a9030b37617a3820fdbb2e87262dd9a6400" Apr 23 18:07:32.575928 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.575896 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd"] Apr 23 18:07:32.576365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.576194 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cac62d37-3ae2-41f0-b2ef-81e680878bd4" containerName="s3-tls-init-custom" Apr 23 18:07:32.576365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.576206 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac62d37-3ae2-41f0-b2ef-81e680878bd4" containerName="s3-tls-init-custom" Apr 23 18:07:32.576365 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.576277 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cac62d37-3ae2-41f0-b2ef-81e680878bd4" 
containerName="s3-tls-init-custom" Apr 23 18:07:32.578953 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.578936 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.581068 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.581042 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fr69r\"" Apr 23 18:07:32.581068 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.581063 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 23 18:07:32.581236 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.581063 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 23 18:07:32.586261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.585965 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd"] Apr 23 18:07:32.691629 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.691596 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf8pd\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-kube-api-access-wf8pd\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.691804 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.691636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.691804 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:07:32.691736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/16fe4703-96ab-4ece-9f48-0f51e78658ad-data\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.792880 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.792849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/16fe4703-96ab-4ece-9f48-0f51e78658ad-data\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.792880 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.792886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf8pd\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-kube-api-access-wf8pd\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.793112 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.792903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.793112 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:07:32.792991 2578 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 23 18:07:32.793112 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:07:32.793002 2578 projected.go:194] Error preparing data for projected 
volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd: secret "seaweedfs-tls-serving" not found Apr 23 18:07:32.793112 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:07:32.793059 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-seaweedfs-tls-serving podName:16fe4703-96ab-4ece-9f48-0f51e78658ad nodeName:}" failed. No retries permitted until 2026-04-23 18:07:33.293044728 +0000 UTC m=+537.526786622 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-cqtwd" (UID: "16fe4703-96ab-4ece-9f48-0f51e78658ad") : secret "seaweedfs-tls-serving" not found Apr 23 18:07:32.793318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.793214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/16fe4703-96ab-4ece-9f48-0f51e78658ad-data\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:32.802931 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:32.802908 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf8pd\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-kube-api-access-wf8pd\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:33.297074 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:33.297034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: 
\"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:33.299341 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:33.299321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/16fe4703-96ab-4ece-9f48-0f51e78658ad-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-cqtwd\" (UID: \"16fe4703-96ab-4ece-9f48-0f51e78658ad\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:33.488935 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:33.488901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" Apr 23 18:07:33.606482 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:33.606449 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd"] Apr 23 18:07:33.609498 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:07:33.609476 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fe4703_96ab_4ece_9f48_0f51e78658ad.slice/crio-3d8ab2f96fd7db166600ee98823383c30c465fead06bb3659269fd95d9da6d1d WatchSource:0}: Error finding container 3d8ab2f96fd7db166600ee98823383c30c465fead06bb3659269fd95d9da6d1d: Status 404 returned error can't find the container with id 3d8ab2f96fd7db166600ee98823383c30c465fead06bb3659269fd95d9da6d1d Apr 23 18:07:34.001261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:34.001223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" event={"ID":"16fe4703-96ab-4ece-9f48-0f51e78658ad","Type":"ContainerStarted","Data":"346642868b3a73817cc513dfbbae49ee68622653fd78526e747ae9964e97448f"} Apr 23 18:07:34.001261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:34.001266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" event={"ID":"16fe4703-96ab-4ece-9f48-0f51e78658ad","Type":"ContainerStarted","Data":"3d8ab2f96fd7db166600ee98823383c30c465fead06bb3659269fd95d9da6d1d"} Apr 23 18:07:34.017896 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:07:34.017852 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-cqtwd" podStartSLOduration=1.770189739 podStartE2EDuration="2.017834435s" podCreationTimestamp="2026-04-23 18:07:32 +0000 UTC" firstStartedPulling="2026-04-23 18:07:33.61112901 +0000 UTC m=+537.844870907" lastFinishedPulling="2026-04-23 18:07:33.858773703 +0000 UTC m=+538.092515603" observedRunningTime="2026-04-23 18:07:34.016886627 +0000 UTC m=+538.250628564" watchObservedRunningTime="2026-04-23 18:07:34.017834435 +0000 UTC m=+538.251576353" Apr 23 18:08:36.253010 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:08:36.252981 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:08:36.253497 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:08:36.253389 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:08:36.259206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:08:36.259184 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:08:36.259836 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:08:36.259818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:13:36.274063 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:13:36.273986 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:13:36.275063 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:13:36.275040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:13:36.280356 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:13:36.280337 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:13:36.281321 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:13:36.281301 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:18:36.295281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:18:36.295249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:18:36.298029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:18:36.298009 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:18:36.301421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:18:36.301404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:18:36.304137 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:18:36.304119 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:21:22.861425 ip-10-0-143-63 kubenswrapper[2578]: 
I0423 18:21:22.861387 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5"] Apr 23 18:21:22.865623 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.865598 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:22.868936 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.868906 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:21:22.868936 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.868929 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 23 18:21:22.869140 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.868988 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:21:22.869140 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.868919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:21:22.869263 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.869240 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 23 18:21:22.879618 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.879589 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5"] Apr 23 18:21:22.923336 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.923289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-isvc-paddle-kube-rbac-proxy-sar-config\") pod 
\"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:22.923336 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.923336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:22.923594 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.923367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:22.923594 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:22.923402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g26x7\" (UniqueName: \"kubernetes.io/projected/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kube-api-access-g26x7\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.024710 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.024669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.024710 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.024712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.024976 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.024748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g26x7\" (UniqueName: \"kubernetes.io/projected/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kube-api-access-g26x7\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.024976 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.024805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.025163 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.025135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.025382 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.025363 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.027285 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.027260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.033869 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.033848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g26x7\" (UniqueName: \"kubernetes.io/projected/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kube-api-access-g26x7\") pod \"isvc-paddle-predictor-6b8b7cfb4b-7fcj5\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.179604 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.179504 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:23.306832 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.306793 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5"] Apr 23 18:21:23.308897 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:21:23.308868 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b542d7_07d7_4bf5_abc6_79360cecf5d7.slice/crio-5a31d7fd89a79be9eaf5c36e8f1968398bfde19b43e0710666f66e8243c23e33 WatchSource:0}: Error finding container 5a31d7fd89a79be9eaf5c36e8f1968398bfde19b43e0710666f66e8243c23e33: Status 404 returned error can't find the container with id 5a31d7fd89a79be9eaf5c36e8f1968398bfde19b43e0710666f66e8243c23e33 Apr 23 18:21:23.310822 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.310802 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:21:23.638710 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:23.638672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerStarted","Data":"5a31d7fd89a79be9eaf5c36e8f1968398bfde19b43e0710666f66e8243c23e33"} Apr 23 18:21:27.654326 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:27.654288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerStarted","Data":"d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62"} Apr 23 18:21:31.669009 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:31.668921 2578 generic.go:358] "Generic (PLEG): container finished" podID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerID="d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62" 
exitCode=0 Apr 23 18:21:31.669009 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:31.668996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerDied","Data":"d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62"} Apr 23 18:21:42.714932 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:42.714880 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerStarted","Data":"a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9"} Apr 23 18:21:45.727556 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:45.727513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerStarted","Data":"e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29"} Apr 23 18:21:45.727944 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:45.727672 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:45.746844 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:45.746721 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podStartSLOduration=1.555971944 podStartE2EDuration="23.746701895s" podCreationTimestamp="2026-04-23 18:21:22 +0000 UTC" firstStartedPulling="2026-04-23 18:21:23.31097545 +0000 UTC m=+1367.544717348" lastFinishedPulling="2026-04-23 18:21:45.501705403 +0000 UTC m=+1389.735447299" observedRunningTime="2026-04-23 18:21:45.746087341 +0000 UTC m=+1389.979829258" watchObservedRunningTime="2026-04-23 18:21:45.746701895 +0000 UTC m=+1389.980443813" Apr 23 18:21:46.730521 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:21:46.730484 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:46.731789 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:46.731762 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:21:47.734129 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:47.734082 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:21:52.739094 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:52.739062 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:21:52.739711 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:21:52.739664 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:02.740121 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:02.740067 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:12.739801 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:12.739760 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:22.739837 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:22.739794 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 18:22:32.740660 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:32.740579 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:22:44.363809 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.363768 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5"] Apr 23 18:22:44.364292 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.364082 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" containerID="cri-o://a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9" gracePeriod=30 Apr 23 18:22:44.364292 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.364129 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kube-rbac-proxy" containerID="cri-o://e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29" gracePeriod=30 Apr 23 18:22:44.457915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.457870 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb"] Apr 23 18:22:44.461482 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.461462 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.463239 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.463214 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:22:44.463333 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.463218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 23 18:22:44.472671 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.472648 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb"] Apr 23 18:22:44.606912 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.606862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.607108 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.606926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.607108 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.606954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.607108 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.607056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk4wd\" (UniqueName: \"kubernetes.io/projected/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kube-api-access-jk4wd\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.707890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.707784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.707890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.707825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.708114 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:22:44.707901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4wd\" (UniqueName: \"kubernetes.io/projected/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kube-api-access-jk4wd\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.708114 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.707941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.708294 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.708269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.708616 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.708593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.710398 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.710374 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.715827 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.715796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4wd\" (UniqueName: \"kubernetes.io/projected/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kube-api-access-jk4wd\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.773169 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.773138 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:44.899914 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.899880 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb"] Apr 23 18:22:44.903114 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:22:44.903088 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c5dfb9_c60a_4001_a11f_4b4eb6d89926.slice/crio-2f1642b55c7a859ac8d78963b44c55e0d7d074f0bb56f50970f3c98a32492245 WatchSource:0}: Error finding container 2f1642b55c7a859ac8d78963b44c55e0d7d074f0bb56f50970f3c98a32492245: Status 404 returned error can't find the container with id 2f1642b55c7a859ac8d78963b44c55e0d7d074f0bb56f50970f3c98a32492245 Apr 23 18:22:44.916258 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.916231 2578 generic.go:358] "Generic (PLEG): container finished" podID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" 
containerID="e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29" exitCode=2 Apr 23 18:22:44.916391 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.916312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerDied","Data":"e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29"} Apr 23 18:22:44.917463 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:44.917439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerStarted","Data":"2f1642b55c7a859ac8d78963b44c55e0d7d074f0bb56f50970f3c98a32492245"} Apr 23 18:22:45.922639 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:45.922595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerStarted","Data":"a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3"} Apr 23 18:22:47.105248 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.105224 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:22:47.132308 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.132275 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g26x7\" (UniqueName: \"kubernetes.io/projected/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kube-api-access-g26x7\") pod \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " Apr 23 18:22:47.132494 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.132324 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-proxy-tls\") pod \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " Apr 23 18:22:47.132494 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.132438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " Apr 23 18:22:47.132494 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.132480 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kserve-provision-location\") pod \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\" (UID: \"a0b542d7-07d7-4bf5-abc6-79360cecf5d7\") " Apr 23 18:22:47.132889 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.132842 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod 
"a0b542d7-07d7-4bf5-abc6-79360cecf5d7" (UID: "a0b542d7-07d7-4bf5-abc6-79360cecf5d7"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:22:47.134686 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.134659 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a0b542d7-07d7-4bf5-abc6-79360cecf5d7" (UID: "a0b542d7-07d7-4bf5-abc6-79360cecf5d7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:22:47.134792 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.134683 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kube-api-access-g26x7" (OuterVolumeSpecName: "kube-api-access-g26x7") pod "a0b542d7-07d7-4bf5-abc6-79360cecf5d7" (UID: "a0b542d7-07d7-4bf5-abc6-79360cecf5d7"). InnerVolumeSpecName "kube-api-access-g26x7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:22:47.140948 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.140918 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0b542d7-07d7-4bf5-abc6-79360cecf5d7" (UID: "a0b542d7-07d7-4bf5-abc6-79360cecf5d7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:22:47.233752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.233668 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g26x7\" (UniqueName: \"kubernetes.io/projected/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kube-api-access-g26x7\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:22:47.233752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.233697 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:22:47.233752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.233710 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:22:47.233752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.233720 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0b542d7-07d7-4bf5-abc6-79360cecf5d7-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:22:47.931129 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.931092 2578 generic.go:358] "Generic (PLEG): container finished" podID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerID="a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9" exitCode=0 Apr 23 18:22:47.931313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.931149 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerDied","Data":"a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9"} Apr 23 
18:22:47.931313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.931178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" event={"ID":"a0b542d7-07d7-4bf5-abc6-79360cecf5d7","Type":"ContainerDied","Data":"5a31d7fd89a79be9eaf5c36e8f1968398bfde19b43e0710666f66e8243c23e33"} Apr 23 18:22:47.931313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.931193 2578 scope.go:117] "RemoveContainer" containerID="e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29" Apr 23 18:22:47.931313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.931196 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5" Apr 23 18:22:47.939704 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.939668 2578 scope.go:117] "RemoveContainer" containerID="a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9" Apr 23 18:22:47.946998 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.946978 2578 scope.go:117] "RemoveContainer" containerID="d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62" Apr 23 18:22:47.953279 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.953255 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5"] Apr 23 18:22:47.954709 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.954680 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-7fcj5"] Apr 23 18:22:47.955123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.955103 2578 scope.go:117] "RemoveContainer" containerID="e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29" Apr 23 18:22:47.955404 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:22:47.955377 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29\": container with ID starting with e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29 not found: ID does not exist" containerID="e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29" Apr 23 18:22:47.955471 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.955414 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29"} err="failed to get container status \"e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29\": rpc error: code = NotFound desc = could not find container \"e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29\": container with ID starting with e15e5804aa959dec27616e8d9351dd006da4c708ab3e899967dfdcbc92c66e29 not found: ID does not exist" Apr 23 18:22:47.955471 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.955434 2578 scope.go:117] "RemoveContainer" containerID="a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9" Apr 23 18:22:47.955717 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:22:47.955693 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9\": container with ID starting with a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9 not found: ID does not exist" containerID="a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9" Apr 23 18:22:47.955786 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.955727 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9"} err="failed to get container status \"a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9\": rpc error: code = NotFound desc = could not find container 
\"a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9\": container with ID starting with a475289fac44be40284ffeaf219b8318fe76f7df110f44efe84e49ba1b11f1a9 not found: ID does not exist" Apr 23 18:22:47.955786 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.955750 2578 scope.go:117] "RemoveContainer" containerID="d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62" Apr 23 18:22:47.955970 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:22:47.955955 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62\": container with ID starting with d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62 not found: ID does not exist" containerID="d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62" Apr 23 18:22:47.956028 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:47.955977 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62"} err="failed to get container status \"d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62\": rpc error: code = NotFound desc = could not find container \"d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62\": container with ID starting with d2a73a4e787a5c14211f2103b29acdcfe0d9fd30afec66d92da51036fde2ad62 not found: ID does not exist" Apr 23 18:22:48.357677 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:48.357639 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" path="/var/lib/kubelet/pods/a0b542d7-07d7-4bf5-abc6-79360cecf5d7/volumes" Apr 23 18:22:49.940067 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:49.940030 2578 generic.go:358] "Generic (PLEG): container finished" podID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" 
containerID="a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3" exitCode=0 Apr 23 18:22:49.940445 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:49.940101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerDied","Data":"a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3"} Apr 23 18:22:50.944886 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:50.944848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerStarted","Data":"5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced"} Apr 23 18:22:50.944886 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:50.944893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerStarted","Data":"2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9"} Apr 23 18:22:50.945297 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:50.945091 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:50.963067 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:50.963018 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podStartSLOduration=6.963001644 podStartE2EDuration="6.963001644s" podCreationTimestamp="2026-04-23 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:22:50.961427949 +0000 UTC m=+1455.195169865" watchObservedRunningTime="2026-04-23 18:22:50.963001644 +0000 UTC 
m=+1455.196743561" Apr 23 18:22:51.948578 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:51.948545 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:51.949866 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:51.949840 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:22:52.951907 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:52.951865 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:22:57.956270 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:57.956234 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:22:57.956752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:22:57.956725 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:07.957288 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:07.957244 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:17.956765 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:17.956729 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:27.956708 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:27.956667 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:36.317881 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:36.317853 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:23:36.321244 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:36.321223 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:23:36.324750 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:36.324730 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:23:36.333235 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:36.333214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:23:37.958094 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:37.958063 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 
18:23:45.874203 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:45.874164 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb"] Apr 23 18:23:45.874680 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:45.874651 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" containerID="cri-o://2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9" gracePeriod=30 Apr 23 18:23:45.874768 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:45.874710 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kube-rbac-proxy" containerID="cri-o://5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced" gracePeriod=30 Apr 23 18:23:46.007890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.007856 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n"] Apr 23 18:23:46.008224 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008207 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="storage-initializer" Apr 23 18:23:46.008224 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008222 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="storage-initializer" Apr 23 18:23:46.008403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008241 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kube-rbac-proxy" Apr 23 18:23:46.008403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008249 
2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kube-rbac-proxy" Apr 23 18:23:46.008403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008259 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" Apr 23 18:23:46.008403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008268 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" Apr 23 18:23:46.008403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008350 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kube-rbac-proxy" Apr 23 18:23:46.008403 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.008362 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0b542d7-07d7-4bf5-abc6-79360cecf5d7" containerName="kserve-container" Apr 23 18:23:46.011493 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.011473 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.013250 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.013228 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 23 18:23:46.013582 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.013565 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 18:23:46.020959 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.020914 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n"] Apr 23 18:23:46.119186 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.119152 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/049c816d-da7b-4ea7-9a10-6f5563883abc-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.119376 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.119208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/049c816d-da7b-4ea7-9a10-6f5563883abc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.119376 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.119317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/049c816d-da7b-4ea7-9a10-6f5563883abc-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.119497 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.119376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvw9\" (UniqueName: \"kubernetes.io/projected/049c816d-da7b-4ea7-9a10-6f5563883abc-kube-api-access-zvvw9\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.133205 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.133131 2578 generic.go:358] "Generic (PLEG): container finished" podID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerID="5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced" exitCode=2 Apr 23 18:23:46.133328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.133209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerDied","Data":"5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced"} Apr 23 18:23:46.220471 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.220434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/049c816d-da7b-4ea7-9a10-6f5563883abc-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.220471 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.220485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/049c816d-da7b-4ea7-9a10-6f5563883abc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.220763 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.220566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/049c816d-da7b-4ea7-9a10-6f5563883abc-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.220763 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.220623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvw9\" (UniqueName: \"kubernetes.io/projected/049c816d-da7b-4ea7-9a10-6f5563883abc-kube-api-access-zvvw9\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.220987 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.220966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/049c816d-da7b-4ea7-9a10-6f5563883abc-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.221219 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.221202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/049c816d-da7b-4ea7-9a10-6f5563883abc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.223221 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.223203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/049c816d-da7b-4ea7-9a10-6f5563883abc-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.228473 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.228450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvw9\" (UniqueName: \"kubernetes.io/projected/049c816d-da7b-4ea7-9a10-6f5563883abc-kube-api-access-zvvw9\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.323753 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.323718 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:46.451793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:46.451759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n"] Apr 23 18:23:46.454321 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:23:46.454292 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049c816d_da7b_4ea7_9a10_6f5563883abc.slice/crio-012e31ca3c9f0a502a21bc33cfd0d1c7d1c0ac008e72ce5ece7e3f940512d792 WatchSource:0}: Error finding container 012e31ca3c9f0a502a21bc33cfd0d1c7d1c0ac008e72ce5ece7e3f940512d792: Status 404 returned error can't find the container with id 012e31ca3c9f0a502a21bc33cfd0d1c7d1c0ac008e72ce5ece7e3f940512d792 Apr 23 18:23:47.138452 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:47.138412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerStarted","Data":"397107c0c14022bd21e69294c0ccdba37e1a83b14ed8096ee0dda3c3740477a3"} Apr 23 18:23:47.138452 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:47.138450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerStarted","Data":"012e31ca3c9f0a502a21bc33cfd0d1c7d1c0ac008e72ce5ece7e3f940512d792"} Apr 23 18:23:47.952228 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:47.952180 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection 
refused" Apr 23 18:23:47.957188 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:47.957161 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 18:23:48.623371 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.623349 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:23:48.743910 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.743832 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " Apr 23 18:23:48.743910 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.743901 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-proxy-tls\") pod \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " Apr 23 18:23:48.744141 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.743939 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk4wd\" (UniqueName: \"kubernetes.io/projected/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kube-api-access-jk4wd\") pod \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " Apr 23 18:23:48.744141 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.743978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kserve-provision-location\") pod \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\" (UID: \"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926\") " Apr 23 18:23:48.744254 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.744183 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" (UID: "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:23:48.746318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.746292 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" (UID: "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:23:48.746318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.746302 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kube-api-access-jk4wd" (OuterVolumeSpecName: "kube-api-access-jk4wd") pod "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" (UID: "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926"). InnerVolumeSpecName "kube-api-access-jk4wd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:23:48.753418 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.753387 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" (UID: "a8c5dfb9-c60a-4001-a11f-4b4eb6d89926"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:23:48.844855 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.844820 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jk4wd\" (UniqueName: \"kubernetes.io/projected/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kube-api-access-jk4wd\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:23:48.844855 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.844849 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:23:48.844855 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.844859 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:23:48.845071 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:48.844872 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:23:49.146259 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.146222 2578 generic.go:358] "Generic (PLEG): 
container finished" podID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerID="2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9" exitCode=0 Apr 23 18:23:49.146434 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.146303 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" Apr 23 18:23:49.146434 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.146302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerDied","Data":"2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9"} Apr 23 18:23:49.146434 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.146351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb" event={"ID":"a8c5dfb9-c60a-4001-a11f-4b4eb6d89926","Type":"ContainerDied","Data":"2f1642b55c7a859ac8d78963b44c55e0d7d074f0bb56f50970f3c98a32492245"} Apr 23 18:23:49.146434 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.146377 2578 scope.go:117] "RemoveContainer" containerID="5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced" Apr 23 18:23:49.154066 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.154043 2578 scope.go:117] "RemoveContainer" containerID="2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9" Apr 23 18:23:49.161191 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.161175 2578 scope.go:117] "RemoveContainer" containerID="a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3" Apr 23 18:23:49.168417 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.168396 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb"] Apr 23 18:23:49.168489 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.168462 2578 
scope.go:117] "RemoveContainer" containerID="5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced" Apr 23 18:23:49.169029 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:23:49.168856 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced\": container with ID starting with 5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced not found: ID does not exist" containerID="5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced" Apr 23 18:23:49.169029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.168894 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced"} err="failed to get container status \"5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced\": rpc error: code = NotFound desc = could not find container \"5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced\": container with ID starting with 5e3afeb9e001b0cbd2e41f822ba9f02f1cbbcb804b315303b9717cf8fcaceced not found: ID does not exist" Apr 23 18:23:49.169029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.168922 2578 scope.go:117] "RemoveContainer" containerID="2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9" Apr 23 18:23:49.169462 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:23:49.169439 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9\": container with ID starting with 2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9 not found: ID does not exist" containerID="2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9" Apr 23 18:23:49.169653 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.169469 2578 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9"} err="failed to get container status \"2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9\": rpc error: code = NotFound desc = could not find container \"2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9\": container with ID starting with 2489fc9d92642e2d7b5ba897f8eb967c3eba451385efc74eb0fe2bffc830bff9 not found: ID does not exist" Apr 23 18:23:49.169653 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.169499 2578 scope.go:117] "RemoveContainer" containerID="a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3" Apr 23 18:23:49.169804 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:23:49.169770 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3\": container with ID starting with a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3 not found: ID does not exist" containerID="a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3" Apr 23 18:23:49.169915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.169809 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3"} err="failed to get container status \"a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3\": rpc error: code = NotFound desc = could not find container \"a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3\": container with ID starting with a7e8c5e53f19bc0dc662796a78e566e8f6057738b886838882bc42f92accf8c3 not found: ID does not exist" Apr 23 18:23:49.170990 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:49.170968 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-qzbtb"] Apr 
23 18:23:50.357769 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:50.357735 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" path="/var/lib/kubelet/pods/a8c5dfb9-c60a-4001-a11f-4b4eb6d89926/volumes" Apr 23 18:23:51.156382 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:51.156350 2578 generic.go:358] "Generic (PLEG): container finished" podID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerID="397107c0c14022bd21e69294c0ccdba37e1a83b14ed8096ee0dda3c3740477a3" exitCode=0 Apr 23 18:23:51.156518 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:51.156422 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerDied","Data":"397107c0c14022bd21e69294c0ccdba37e1a83b14ed8096ee0dda3c3740477a3"} Apr 23 18:23:52.161983 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:52.161947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerStarted","Data":"6244aa72a3c1e4618d5a99be43eab872d04f04afe85251c14f345f0cfb824d58"} Apr 23 18:23:52.162382 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:52.161990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerStarted","Data":"c5aae95d5a625b0e4c474e7b07d3acaea317ff7e68bbc6fd47ae8ed2746dc69d"} Apr 23 18:23:52.162382 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:52.162312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:52.182619 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:52.182576 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podStartSLOduration=7.182563223 podStartE2EDuration="7.182563223s" podCreationTimestamp="2026-04-23 18:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:23:52.180839951 +0000 UTC m=+1516.414581867" watchObservedRunningTime="2026-04-23 18:23:52.182563223 +0000 UTC m=+1516.416305135" Apr 23 18:23:53.165353 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:53.165318 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:53.166498 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:53.166473 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:23:54.168583 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:54.168546 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:23:59.172896 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:59.172863 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:23:59.173478 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:23:59.173446 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:24:09.173563 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:09.173497 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:24:19.174093 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:19.174051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:24:29.174089 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:29.174051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:24:39.174866 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:39.174834 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:24:47.672723 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.672688 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n"] Apr 23 18:24:47.673178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.672975 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" 
containerID="cri-o://c5aae95d5a625b0e4c474e7b07d3acaea317ff7e68bbc6fd47ae8ed2746dc69d" gracePeriod=30 Apr 23 18:24:47.673178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.673031 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kube-rbac-proxy" containerID="cri-o://6244aa72a3c1e4618d5a99be43eab872d04f04afe85251c14f345f0cfb824d58" gracePeriod=30 Apr 23 18:24:47.774330 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774298 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9"] Apr 23 18:24:47.774688 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774675 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kube-rbac-proxy" Apr 23 18:24:47.774744 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774690 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kube-rbac-proxy" Apr 23 18:24:47.774744 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774706 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" Apr 23 18:24:47.774744 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774713 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" Apr 23 18:24:47.774744 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774725 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="storage-initializer" Apr 23 18:24:47.774744 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774731 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="storage-initializer" Apr 23 18:24:47.774888 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774784 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kube-rbac-proxy" Apr 23 18:24:47.774888 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.774795 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8c5dfb9-c60a-4001-a11f-4b4eb6d89926" containerName="kserve-container" Apr 23 18:24:47.777982 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.777957 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:47.779930 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.779907 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 23 18:24:47.780041 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.779933 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 23 18:24:47.792220 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.792196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9"] Apr 23 18:24:47.934742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.934641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca8ff6e1-44aa-47f6-8239-1e961c304403-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:47.934742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.934709 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:47.934971 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.934751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8ff6e1-44aa-47f6-8239-1e961c304403-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:47.934971 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:47.934793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvsw\" (UniqueName: \"kubernetes.io/projected/ca8ff6e1-44aa-47f6-8239-1e961c304403-kube-api-access-dzvsw\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.035187 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.035147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca8ff6e1-44aa-47f6-8239-1e961c304403-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.035377 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.035201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls\") pod 
\"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.035377 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.035242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8ff6e1-44aa-47f6-8239-1e961c304403-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.035377 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.035281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvsw\" (UniqueName: \"kubernetes.io/projected/ca8ff6e1-44aa-47f6-8239-1e961c304403-kube-api-access-dzvsw\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.035377 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:24:48.035315 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 23 18:24:48.035613 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:24:48.035389 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls podName:ca8ff6e1-44aa-47f6-8239-1e961c304403 nodeName:}" failed. No retries permitted until 2026-04-23 18:24:48.535368511 +0000 UTC m=+1572.769110419 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls") pod "isvc-pmml-predictor-8bb578669-5dzf9" (UID: "ca8ff6e1-44aa-47f6-8239-1e961c304403") : secret "isvc-pmml-predictor-serving-cert" not found Apr 23 18:24:48.035680 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.035662 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8ff6e1-44aa-47f6-8239-1e961c304403-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.035872 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.035854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca8ff6e1-44aa-47f6-8239-1e961c304403-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.054162 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.054132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvsw\" (UniqueName: \"kubernetes.io/projected/ca8ff6e1-44aa-47f6-8239-1e961c304403-kube-api-access-dzvsw\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.344606 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.344570 2578 generic.go:358] "Generic (PLEG): container finished" podID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerID="6244aa72a3c1e4618d5a99be43eab872d04f04afe85251c14f345f0cfb824d58" exitCode=2 Apr 23 18:24:48.344777 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.344644 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerDied","Data":"6244aa72a3c1e4618d5a99be43eab872d04f04afe85251c14f345f0cfb824d58"} Apr 23 18:24:48.539722 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.539687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.542000 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.541983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-5dzf9\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.688930 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.688851 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:24:48.811937 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:48.811907 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9"] Apr 23 18:24:48.814067 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:24:48.814033 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca8ff6e1_44aa_47f6_8239_1e961c304403.slice/crio-770fb19db432b9ba8fd93251f7cc1716f2252899618d8c084ee100a5872067de WatchSource:0}: Error finding container 770fb19db432b9ba8fd93251f7cc1716f2252899618d8c084ee100a5872067de: Status 404 returned error can't find the container with id 770fb19db432b9ba8fd93251f7cc1716f2252899618d8c084ee100a5872067de Apr 23 18:24:49.168789 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:49.168749 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 23 18:24:49.174101 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:49.174064 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 18:24:49.349637 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:49.349597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerStarted","Data":"7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65"} Apr 23 
18:24:49.349637 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:49.349639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerStarted","Data":"770fb19db432b9ba8fd93251f7cc1716f2252899618d8c084ee100a5872067de"} Apr 23 18:24:50.356973 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.356938 2578 generic.go:358] "Generic (PLEG): container finished" podID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerID="c5aae95d5a625b0e4c474e7b07d3acaea317ff7e68bbc6fd47ae8ed2746dc69d" exitCode=0 Apr 23 18:24:50.359049 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.359021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerDied","Data":"c5aae95d5a625b0e4c474e7b07d3acaea317ff7e68bbc6fd47ae8ed2746dc69d"} Apr 23 18:24:50.407367 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.407343 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:24:50.452003 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.451972 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/049c816d-da7b-4ea7-9a10-6f5563883abc-proxy-tls\") pod \"049c816d-da7b-4ea7-9a10-6f5563883abc\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " Apr 23 18:24:50.453936 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.453913 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049c816d-da7b-4ea7-9a10-6f5563883abc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "049c816d-da7b-4ea7-9a10-6f5563883abc" (UID: "049c816d-da7b-4ea7-9a10-6f5563883abc"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:24:50.553125 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.553101 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/049c816d-da7b-4ea7-9a10-6f5563883abc-kserve-provision-location\") pod \"049c816d-da7b-4ea7-9a10-6f5563883abc\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " Apr 23 18:24:50.553291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.553201 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvvw9\" (UniqueName: \"kubernetes.io/projected/049c816d-da7b-4ea7-9a10-6f5563883abc-kube-api-access-zvvw9\") pod \"049c816d-da7b-4ea7-9a10-6f5563883abc\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " Apr 23 18:24:50.553291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.553233 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/049c816d-da7b-4ea7-9a10-6f5563883abc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"049c816d-da7b-4ea7-9a10-6f5563883abc\" (UID: \"049c816d-da7b-4ea7-9a10-6f5563883abc\") " Apr 23 18:24:50.553452 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.553434 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/049c816d-da7b-4ea7-9a10-6f5563883abc-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:24:50.553597 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.553573 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049c816d-da7b-4ea7-9a10-6f5563883abc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "049c816d-da7b-4ea7-9a10-6f5563883abc" (UID: 
"049c816d-da7b-4ea7-9a10-6f5563883abc"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:24:50.555270 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.555248 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049c816d-da7b-4ea7-9a10-6f5563883abc-kube-api-access-zvvw9" (OuterVolumeSpecName: "kube-api-access-zvvw9") pod "049c816d-da7b-4ea7-9a10-6f5563883abc" (UID: "049c816d-da7b-4ea7-9a10-6f5563883abc"). InnerVolumeSpecName "kube-api-access-zvvw9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:24:50.562738 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.562715 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049c816d-da7b-4ea7-9a10-6f5563883abc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "049c816d-da7b-4ea7-9a10-6f5563883abc" (UID: "049c816d-da7b-4ea7-9a10-6f5563883abc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:24:50.654295 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.654267 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvvw9\" (UniqueName: \"kubernetes.io/projected/049c816d-da7b-4ea7-9a10-6f5563883abc-kube-api-access-zvvw9\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:24:50.654295 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.654292 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/049c816d-da7b-4ea7-9a10-6f5563883abc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:24:50.654446 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:50.654304 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/049c816d-da7b-4ea7-9a10-6f5563883abc-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:24:51.363029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.362991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" event={"ID":"049c816d-da7b-4ea7-9a10-6f5563883abc","Type":"ContainerDied","Data":"012e31ca3c9f0a502a21bc33cfd0d1c7d1c0ac008e72ce5ece7e3f940512d792"} Apr 23 18:24:51.363029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.363020 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n" Apr 23 18:24:51.363029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.363034 2578 scope.go:117] "RemoveContainer" containerID="6244aa72a3c1e4618d5a99be43eab872d04f04afe85251c14f345f0cfb824d58" Apr 23 18:24:51.371261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.371241 2578 scope.go:117] "RemoveContainer" containerID="c5aae95d5a625b0e4c474e7b07d3acaea317ff7e68bbc6fd47ae8ed2746dc69d" Apr 23 18:24:51.378211 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.378191 2578 scope.go:117] "RemoveContainer" containerID="397107c0c14022bd21e69294c0ccdba37e1a83b14ed8096ee0dda3c3740477a3" Apr 23 18:24:51.384468 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.384443 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n"] Apr 23 18:24:51.388970 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:51.388948 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-6x57n"] Apr 23 18:24:52.357268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:52.357234 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" path="/var/lib/kubelet/pods/049c816d-da7b-4ea7-9a10-6f5563883abc/volumes" Apr 23 18:24:53.371502 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:53.371470 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerID="7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65" exitCode=0 Apr 23 18:24:53.372002 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:24:53.371558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerDied","Data":"7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65"} 
Apr 23 18:25:00.404949 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:00.404854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerStarted","Data":"44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6"} Apr 23 18:25:00.404949 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:00.404914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerStarted","Data":"e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689"} Apr 23 18:25:00.405882 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:00.405853 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:25:00.423353 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:00.423282 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podStartSLOduration=6.709777729 podStartE2EDuration="13.423269978s" podCreationTimestamp="2026-04-23 18:24:47 +0000 UTC" firstStartedPulling="2026-04-23 18:24:53.372709549 +0000 UTC m=+1577.606451444" lastFinishedPulling="2026-04-23 18:25:00.086201787 +0000 UTC m=+1584.319943693" observedRunningTime="2026-04-23 18:25:00.422874212 +0000 UTC m=+1584.656616128" watchObservedRunningTime="2026-04-23 18:25:00.423269978 +0000 UTC m=+1584.657011894" Apr 23 18:25:01.408952 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:01.408910 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:25:01.410318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:01.410291 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" 
podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:02.412523 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:02.412481 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:07.418094 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:07.418067 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:25:07.418559 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:07.418518 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:17.418648 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:17.418602 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:27.419136 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:27.419056 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:37.419523 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:37.419477 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:47.418913 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:47.418863 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:25:57.418901 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:25:57.418851 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:26:07.419643 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:07.419586 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:26:17.418903 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:17.418851 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 18:26:25.354656 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:25.354624 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:26:28.867045 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:28.867007 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9"] Apr 23 18:26:28.867570 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:28.867322 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" containerID="cri-o://e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689" gracePeriod=30 Apr 23 18:26:28.867570 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:28.867347 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kube-rbac-proxy" containerID="cri-o://44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6" gracePeriod=30 Apr 23 18:26:29.710800 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:29.710764 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerID="44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6" exitCode=2 Apr 23 18:26:29.710999 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:29.710836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerDied","Data":"44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6"} Apr 23 18:26:32.413218 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.413172 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 23 18:26:32.716798 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:26:32.716773 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:26:32.722188 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.722152 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerID="e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689" exitCode=0 Apr 23 18:26:32.722317 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.722228 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" Apr 23 18:26:32.722317 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.722224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerDied","Data":"e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689"} Apr 23 18:26:32.722399 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.722332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9" event={"ID":"ca8ff6e1-44aa-47f6-8239-1e961c304403","Type":"ContainerDied","Data":"770fb19db432b9ba8fd93251f7cc1716f2252899618d8c084ee100a5872067de"} Apr 23 18:26:32.722399 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.722349 2578 scope.go:117] "RemoveContainer" containerID="44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6" Apr 23 18:26:32.732718 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.732652 2578 scope.go:117] "RemoveContainer" containerID="e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689" Apr 23 18:26:32.742559 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.742513 2578 scope.go:117] "RemoveContainer" containerID="7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65" Apr 23 18:26:32.750560 ip-10-0-143-63 kubenswrapper[2578]: 
I0423 18:26:32.750514 2578 scope.go:117] "RemoveContainer" containerID="44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6" Apr 23 18:26:32.750881 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:26:32.750861 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6\": container with ID starting with 44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6 not found: ID does not exist" containerID="44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6" Apr 23 18:26:32.750963 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.750899 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6"} err="failed to get container status \"44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6\": rpc error: code = NotFound desc = could not find container \"44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6\": container with ID starting with 44896ce9ba9f1c89cc4006f86400c230fba50b5805c760e628e2ce1e6cc14dd6 not found: ID does not exist" Apr 23 18:26:32.750963 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.750921 2578 scope.go:117] "RemoveContainer" containerID="e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689" Apr 23 18:26:32.751207 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:26:32.751177 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689\": container with ID starting with e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689 not found: ID does not exist" containerID="e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689" Apr 23 18:26:32.751277 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.751215 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689"} err="failed to get container status \"e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689\": rpc error: code = NotFound desc = could not find container \"e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689\": container with ID starting with e1ba97cfa7f17950c7bdcdca2cbb6d89c83664baa32945ed787138a18b235689 not found: ID does not exist" Apr 23 18:26:32.751277 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.751238 2578 scope.go:117] "RemoveContainer" containerID="7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65" Apr 23 18:26:32.751508 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:26:32.751489 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65\": container with ID starting with 7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65 not found: ID does not exist" containerID="7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65" Apr 23 18:26:32.751571 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.751514 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65"} err="failed to get container status \"7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65\": rpc error: code = NotFound desc = could not find container \"7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65\": container with ID starting with 7d2d4704e42380971a999513d2377a7be1f2d28986718c10d7f47bcfe3bb6a65 not found: ID does not exist" Apr 23 18:26:32.803477 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.803439 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls\") pod \"ca8ff6e1-44aa-47f6-8239-1e961c304403\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " Apr 23 18:26:32.803697 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.803499 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca8ff6e1-44aa-47f6-8239-1e961c304403-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"ca8ff6e1-44aa-47f6-8239-1e961c304403\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " Apr 23 18:26:32.803697 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.803525 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8ff6e1-44aa-47f6-8239-1e961c304403-kserve-provision-location\") pod \"ca8ff6e1-44aa-47f6-8239-1e961c304403\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " Apr 23 18:26:32.803697 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.803607 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzvsw\" (UniqueName: \"kubernetes.io/projected/ca8ff6e1-44aa-47f6-8239-1e961c304403-kube-api-access-dzvsw\") pod \"ca8ff6e1-44aa-47f6-8239-1e961c304403\" (UID: \"ca8ff6e1-44aa-47f6-8239-1e961c304403\") " Apr 23 18:26:32.803906 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.803880 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca8ff6e1-44aa-47f6-8239-1e961c304403-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "ca8ff6e1-44aa-47f6-8239-1e961c304403" (UID: "ca8ff6e1-44aa-47f6-8239-1e961c304403"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:26:32.803906 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.803890 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca8ff6e1-44aa-47f6-8239-1e961c304403-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca8ff6e1-44aa-47f6-8239-1e961c304403" (UID: "ca8ff6e1-44aa-47f6-8239-1e961c304403"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:26:32.805740 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.805707 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ca8ff6e1-44aa-47f6-8239-1e961c304403" (UID: "ca8ff6e1-44aa-47f6-8239-1e961c304403"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:26:32.805834 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.805802 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8ff6e1-44aa-47f6-8239-1e961c304403-kube-api-access-dzvsw" (OuterVolumeSpecName: "kube-api-access-dzvsw") pod "ca8ff6e1-44aa-47f6-8239-1e961c304403" (UID: "ca8ff6e1-44aa-47f6-8239-1e961c304403"). InnerVolumeSpecName "kube-api-access-dzvsw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:26:32.904342 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.904307 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzvsw\" (UniqueName: \"kubernetes.io/projected/ca8ff6e1-44aa-47f6-8239-1e961c304403-kube-api-access-dzvsw\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:26:32.904342 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.904338 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca8ff6e1-44aa-47f6-8239-1e961c304403-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:26:32.904584 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.904353 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca8ff6e1-44aa-47f6-8239-1e961c304403-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:26:32.904584 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:32.904368 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8ff6e1-44aa-47f6-8239-1e961c304403-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:26:33.044760 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:33.044723 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9"] Apr 23 18:26:33.049414 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:33.049384 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-5dzf9"] Apr 23 18:26:34.358458 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:26:34.358419 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" 
path="/var/lib/kubelet/pods/ca8ff6e1-44aa-47f6-8239-1e961c304403/volumes" Apr 23 18:28:10.101769 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.101729 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh"] Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102091 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kube-rbac-proxy" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102104 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kube-rbac-proxy" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102114 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102119 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102131 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kube-rbac-proxy" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102136 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kube-rbac-proxy" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102144 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="storage-initializer" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102149 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="storage-initializer" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102156 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="storage-initializer" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102161 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="storage-initializer" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102167 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" Apr 23 18:28:10.102202 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102173 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" Apr 23 18:28:10.102583 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102224 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kube-rbac-proxy" Apr 23 18:28:10.102583 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102235 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kube-rbac-proxy" Apr 23 18:28:10.102583 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102242 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca8ff6e1-44aa-47f6-8239-1e961c304403" containerName="kserve-container" Apr 23 18:28:10.102583 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.102249 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="049c816d-da7b-4ea7-9a10-6f5563883abc" containerName="kserve-container" Apr 23 18:28:10.105300 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.105276 2578 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.107008 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.106982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:28:10.107279 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.107191 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 23 18:28:10.107512 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.107491 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:28:10.107652 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.107509 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 23 18:28:10.107652 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.107644 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:28:10.114449 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.114419 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh"] Apr 23 18:28:10.185822 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.185792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srw67\" (UniqueName: \"kubernetes.io/projected/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kube-api-access-srw67\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.185992 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.185831 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.185992 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.185878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.185992 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.185952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.287044 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.286999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.287044 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.287050 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.287268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.287101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srw67\" (UniqueName: \"kubernetes.io/projected/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kube-api-access-srw67\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.287268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.287123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.287457 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.287436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.287777 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.287755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.289668 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.289644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.295351 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.295325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srw67\" (UniqueName: \"kubernetes.io/projected/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kube-api-access-srw67\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.417559 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.417457 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:10.544705 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.544676 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh"] Apr 23 18:28:10.547375 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:28:10.547345 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac663fb_4900_4bbc_9deb_caf55bca6bdd.slice/crio-6c43a9b60e188ffa193ca9b918fd67fcf80883a0959a2b26fb762f74eade1a8b WatchSource:0}: Error finding container 6c43a9b60e188ffa193ca9b918fd67fcf80883a0959a2b26fb762f74eade1a8b: Status 404 returned error can't find the container with id 6c43a9b60e188ffa193ca9b918fd67fcf80883a0959a2b26fb762f74eade1a8b Apr 23 18:28:10.549315 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:10.549295 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:28:11.045554 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:11.045494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerStarted","Data":"0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f"} Apr 23 18:28:11.045554 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:11.045552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerStarted","Data":"6c43a9b60e188ffa193ca9b918fd67fcf80883a0959a2b26fb762f74eade1a8b"} Apr 23 18:28:15.060488 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:15.060449 2578 generic.go:358] "Generic (PLEG): container finished" podID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" 
containerID="0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f" exitCode=0 Apr 23 18:28:15.060892 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:15.060523 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerDied","Data":"0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f"} Apr 23 18:28:16.066124 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:16.066086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerStarted","Data":"69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd"} Apr 23 18:28:16.066124 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:16.066129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerStarted","Data":"cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52"} Apr 23 18:28:16.066580 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:16.066333 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:16.084043 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:16.083982 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podStartSLOduration=6.083964122 podStartE2EDuration="6.083964122s" podCreationTimestamp="2026-04-23 18:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:28:16.08299219 +0000 UTC m=+1780.316734106" watchObservedRunningTime="2026-04-23 18:28:16.083964122 +0000 UTC 
m=+1780.317706050" Apr 23 18:28:17.069220 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:17.069189 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:17.070421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:17.070392 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:28:18.072459 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:18.072417 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:28:23.076476 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:23.076433 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:28:23.077107 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:23.077076 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:28:33.077045 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:33.076990 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:28:36.346247 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:36.346207 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:28:36.350068 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:36.350042 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:28:36.352791 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:36.352767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:28:36.357009 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:36.356979 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:28:43.076992 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:43.076945 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:28:53.077801 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:28:53.077755 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:29:03.077428 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:03.077384 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" 
podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:29:13.077058 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:13.077005 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:29:23.077025 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:23.076983 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:29:25.354346 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:25.354296 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 18:29:35.354702 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:35.354669 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:29:41.070275 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:41.070235 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh"] Apr 23 18:29:41.070847 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:41.070581 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" 
containerName="kserve-container" containerID="cri-o://cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52" gracePeriod=30 Apr 23 18:29:41.070847 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:41.070633 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kube-rbac-proxy" containerID="cri-o://69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd" gracePeriod=30 Apr 23 18:29:41.359613 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:41.359493 2578 generic.go:358] "Generic (PLEG): container finished" podID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerID="69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd" exitCode=2 Apr 23 18:29:41.359613 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:41.359574 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerDied","Data":"69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd"} Apr 23 18:29:43.073283 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:43.073233 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 23 18:29:44.915176 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.915148 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:29:44.946795 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.946757 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kserve-provision-location\") pod \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " Apr 23 18:29:44.946995 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.946814 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srw67\" (UniqueName: \"kubernetes.io/projected/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kube-api-access-srw67\") pod \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " Apr 23 18:29:44.946995 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.946832 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-proxy-tls\") pod \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " Apr 23 18:29:44.946995 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.946851 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\" (UID: \"1ac663fb-4900-4bbc-9deb-caf55bca6bdd\") " Apr 23 18:29:44.947181 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.947142 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1ac663fb-4900-4bbc-9deb-caf55bca6bdd" (UID: "1ac663fb-4900-4bbc-9deb-caf55bca6bdd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:29:44.947238 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.947186 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "1ac663fb-4900-4bbc-9deb-caf55bca6bdd" (UID: "1ac663fb-4900-4bbc-9deb-caf55bca6bdd"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:29:44.949014 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.948986 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1ac663fb-4900-4bbc-9deb-caf55bca6bdd" (UID: "1ac663fb-4900-4bbc-9deb-caf55bca6bdd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:29:44.949127 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:44.949053 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kube-api-access-srw67" (OuterVolumeSpecName: "kube-api-access-srw67") pod "1ac663fb-4900-4bbc-9deb-caf55bca6bdd" (UID: "1ac663fb-4900-4bbc-9deb-caf55bca6bdd"). InnerVolumeSpecName "kube-api-access-srw67". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:29:45.047419 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.047386 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:29:45.047419 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.047418 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srw67\" (UniqueName: \"kubernetes.io/projected/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-kube-api-access-srw67\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:29:45.047648 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.047428 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:29:45.047648 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.047440 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ac663fb-4900-4bbc-9deb-caf55bca6bdd-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:29:45.377264 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.377169 2578 generic.go:358] "Generic (PLEG): container finished" podID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerID="cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52" exitCode=0 Apr 23 18:29:45.377264 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.377255 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" Apr 23 18:29:45.377464 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.377255 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerDied","Data":"cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52"} Apr 23 18:29:45.377464 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.377299 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh" event={"ID":"1ac663fb-4900-4bbc-9deb-caf55bca6bdd","Type":"ContainerDied","Data":"6c43a9b60e188ffa193ca9b918fd67fcf80883a0959a2b26fb762f74eade1a8b"} Apr 23 18:29:45.377464 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.377315 2578 scope.go:117] "RemoveContainer" containerID="69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd" Apr 23 18:29:45.386125 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.386103 2578 scope.go:117] "RemoveContainer" containerID="cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52" Apr 23 18:29:45.394255 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.394233 2578 scope.go:117] "RemoveContainer" containerID="0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f" Apr 23 18:29:45.400711 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.400675 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh"] Apr 23 18:29:45.403340 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.403312 2578 scope.go:117] "RemoveContainer" containerID="69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd" Apr 23 18:29:45.403760 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:29:45.403736 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd\": container with ID starting with 69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd not found: ID does not exist" containerID="69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd" Apr 23 18:29:45.403844 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.403770 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd"} err="failed to get container status \"69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd\": rpc error: code = NotFound desc = could not find container \"69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd\": container with ID starting with 69a9d66db0b8449f6bd33f8751a6f70fc258321b8e0c87459c7f9f6a056666fd not found: ID does not exist" Apr 23 18:29:45.403844 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.403796 2578 scope.go:117] "RemoveContainer" containerID="cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52" Apr 23 18:29:45.404119 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:29:45.404093 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52\": container with ID starting with cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52 not found: ID does not exist" containerID="cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52" Apr 23 18:29:45.404168 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.404128 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52"} err="failed to get container status \"cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52\": rpc error: code = NotFound desc = could not find container 
\"cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52\": container with ID starting with cef9b52167e9284b3881c22b23c21e7e17b73d0ba32673ad96d25b8489295b52 not found: ID does not exist" Apr 23 18:29:45.404168 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.404150 2578 scope.go:117] "RemoveContainer" containerID="0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f" Apr 23 18:29:45.404412 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:29:45.404391 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f\": container with ID starting with 0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f not found: ID does not exist" containerID="0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f" Apr 23 18:29:45.404461 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.404420 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f"} err="failed to get container status \"0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f\": rpc error: code = NotFound desc = could not find container \"0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f\": container with ID starting with 0ce655781572cf962403ec144ff173dee60e5af3f1ddc94c06fbc2bec69ef74f not found: ID does not exist" Apr 23 18:29:45.404523 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:45.404507 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-vsfbh"] Apr 23 18:29:46.357899 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:29:46.357864 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" path="/var/lib/kubelet/pods/1ac663fb-4900-4bbc-9deb-caf55bca6bdd/volumes" Apr 23 18:31:29.726557 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726438 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"] Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726851 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kube-rbac-proxy" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726866 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kube-rbac-proxy" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726883 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726888 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726897 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="storage-initializer" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726911 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="storage-initializer" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726971 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kserve-container" Apr 23 18:31:29.727100 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.726984 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ac663fb-4900-4bbc-9deb-caf55bca6bdd" containerName="kube-rbac-proxy" Apr 23 
18:31:29.730173 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.730154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:29.732455 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.732431 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 23 18:31:29.732455 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.732442 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 23 18:31:29.732669 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.732507 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:31:29.732721 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.732668 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:31:29.733032 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.733007 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:31:29.742616 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.742580 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"] Apr 23 18:31:29.915908 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.915859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7lg\" (UniqueName: \"kubernetes.io/projected/a24fe136-b454-4c80-8b19-6add7ed87987-kube-api-access-9c7lg\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:29.915908 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.915915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a24fe136-b454-4c80-8b19-6add7ed87987-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:29.916143 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.916039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:29.916143 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:29.916073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a24fe136-b454-4c80-8b19-6add7ed87987-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.017256 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.017219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a24fe136-b454-4c80-8b19-6add7ed87987-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: 
\"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.017420 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.017268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7lg\" (UniqueName: \"kubernetes.io/projected/a24fe136-b454-4c80-8b19-6add7ed87987-kube-api-access-9c7lg\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.017420 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.017301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a24fe136-b454-4c80-8b19-6add7ed87987-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.017420 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.017359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.017640 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:31:30.017464 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 23 18:31:30.017640 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:31:30.017560 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls 
podName:a24fe136-b454-4c80-8b19-6add7ed87987 nodeName:}" failed. No retries permitted until 2026-04-23 18:31:30.517516602 +0000 UTC m=+1974.751258495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" (UID: "a24fe136-b454-4c80-8b19-6add7ed87987") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 23 18:31:30.017831 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.017802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a24fe136-b454-4c80-8b19-6add7ed87987-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.018065 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.018043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a24fe136-b454-4c80-8b19-6add7ed87987-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.026088 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.026064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7lg\" (UniqueName: \"kubernetes.io/projected/a24fe136-b454-4c80-8b19-6add7ed87987-kube-api-access-9c7lg\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.521583 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:31:30.521550 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.523816 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.523789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.641069 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.641012 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:30.771261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:30.771185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"] Apr 23 18:31:30.773584 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:31:30.773508 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24fe136_b454_4c80_8b19_6add7ed87987.slice/crio-39f0e44bbbc8d787b11907dc5b6b16ce918d6c7943f91215f5cab71f718d326a WatchSource:0}: Error finding container 39f0e44bbbc8d787b11907dc5b6b16ce918d6c7943f91215f5cab71f718d326a: Status 404 returned error can't find the container with id 39f0e44bbbc8d787b11907dc5b6b16ce918d6c7943f91215f5cab71f718d326a Apr 23 18:31:31.723268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:31.723227 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerStarted","Data":"d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee"} Apr 23 18:31:31.723268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:31.723269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerStarted","Data":"39f0e44bbbc8d787b11907dc5b6b16ce918d6c7943f91215f5cab71f718d326a"} Apr 23 18:31:34.734900 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:34.734866 2578 generic.go:358] "Generic (PLEG): container finished" podID="a24fe136-b454-4c80-8b19-6add7ed87987" containerID="d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee" exitCode=0 Apr 23 18:31:34.735305 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:34.734939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerDied","Data":"d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee"} Apr 23 18:31:56.826104 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:56.826065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerStarted","Data":"5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f"} Apr 23 18:31:56.826104 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:56.826109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerStarted","Data":"57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d"} Apr 23 18:31:56.826612 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:31:56.826340 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:56.844722 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:56.844657 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podStartSLOduration=6.856060091 podStartE2EDuration="27.844627111s" podCreationTimestamp="2026-04-23 18:31:29 +0000 UTC" firstStartedPulling="2026-04-23 18:31:34.736099827 +0000 UTC m=+1978.969841722" lastFinishedPulling="2026-04-23 18:31:55.724666848 +0000 UTC m=+1999.958408742" observedRunningTime="2026-04-23 18:31:56.843183823 +0000 UTC m=+2001.076925738" watchObservedRunningTime="2026-04-23 18:31:56.844627111 +0000 UTC m=+2001.078369029" Apr 23 18:31:57.829396 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:57.829360 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:31:57.830587 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:57.830559 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:31:58.832732 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:31:58.832697 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:32:03.837025 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:03.836991 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:32:03.837582 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:03.837550 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:32:13.837621 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:13.837576 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:32:23.838394 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:23.838337 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:32:33.837603 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:33.837559 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:32:43.837584 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:43.837517 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" 
Apr 23 18:32:53.837864 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:32:53.837822 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:33:03.838431 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:03.838343 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:33:13.838817 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:13.838783 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:33:19.869857 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.869822 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"] Apr 23 18:33:19.870439 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.870257 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" containerID="cri-o://57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d" gracePeriod=30 Apr 23 18:33:19.870517 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.870433 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kube-rbac-proxy" 
containerID="cri-o://5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f" gracePeriod=30 Apr 23 18:33:19.969223 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.969183 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"] Apr 23 18:33:19.973023 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.973000 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:19.974888 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.974865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 23 18:33:19.975102 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.975075 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 23 18:33:19.983073 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:19.983049 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"] Apr 23 18:33:20.056176 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.056133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.056379 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.056212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttwtq\" (UniqueName: 
\"kubernetes.io/projected/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kube-api-access-ttwtq\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.056379 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.056247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b97cc30-2c8c-43cd-9f9d-47146ad06629-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.056379 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.056277 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.105521 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.105485 2578 generic.go:358] "Generic (PLEG): container finished" podID="a24fe136-b454-4c80-8b19-6add7ed87987" containerID="5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f" exitCode=2 Apr 23 18:33:20.105749 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.105558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerDied","Data":"5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f"} Apr 23 18:33:20.157148 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:33:20.157063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttwtq\" (UniqueName: \"kubernetes.io/projected/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kube-api-access-ttwtq\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.157148 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.157125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b97cc30-2c8c-43cd-9f9d-47146ad06629-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.157351 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.157155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.157351 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.157206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.157351 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:33:20.157344 2578 secret.go:189] 
Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-serving-cert: secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 23 18:33:20.157498 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:33:20.157408 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls podName:5b97cc30-2c8c-43cd-9f9d-47146ad06629 nodeName:}" failed. No retries permitted until 2026-04-23 18:33:20.657394088 +0000 UTC m=+2084.891135983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls") pod "isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" (UID: "5b97cc30-2c8c-43cd-9f9d-47146ad06629") : secret "isvc-predictive-xgboost-predictor-serving-cert" not found Apr 23 18:33:20.157690 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.157664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.157909 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.157891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b97cc30-2c8c-43cd-9f9d-47146ad06629-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.166053 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.166028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ttwtq\" (UniqueName: \"kubernetes.io/projected/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kube-api-access-ttwtq\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.660925 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.660888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.663454 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.663434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:20.885647 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:20.885602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" Apr 23 18:33:21.020069 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:21.020040 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"] Apr 23 18:33:21.023031 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:33:21.022998 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b97cc30_2c8c_43cd_9f9d_47146ad06629.slice/crio-809b5763dc6bbe1bacdb8862dab8a16adf2057a8ec1f51ca186193c35962d70f WatchSource:0}: Error finding container 809b5763dc6bbe1bacdb8862dab8a16adf2057a8ec1f51ca186193c35962d70f: Status 404 returned error can't find the container with id 809b5763dc6bbe1bacdb8862dab8a16adf2057a8ec1f51ca186193c35962d70f Apr 23 18:33:21.025367 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:21.025346 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:33:21.110602 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:21.110566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerStarted","Data":"5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1"} Apr 23 18:33:21.110602 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:21.110605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerStarted","Data":"809b5763dc6bbe1bacdb8862dab8a16adf2057a8ec1f51ca186193c35962d70f"} Apr 23 18:33:23.833058 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:23.833012 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" 
podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 23 18:33:23.838024 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:23.837984 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 18:33:25.123076 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.123037 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" Apr 23 18:33:25.125566 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.125520 2578 generic.go:358] "Generic (PLEG): container finished" podID="a24fe136-b454-4c80-8b19-6add7ed87987" containerID="57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d" exitCode=0 Apr 23 18:33:25.125724 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.125613 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"
Apr 23 18:33:25.125724 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.125624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerDied","Data":"57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d"}
Apr 23 18:33:25.125724 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.125659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn" event={"ID":"a24fe136-b454-4c80-8b19-6add7ed87987","Type":"ContainerDied","Data":"39f0e44bbbc8d787b11907dc5b6b16ce918d6c7943f91215f5cab71f718d326a"}
Apr 23 18:33:25.125724 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.125679 2578 scope.go:117] "RemoveContainer" containerID="5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f"
Apr 23 18:33:25.127285 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.127257 2578 generic.go:358] "Generic (PLEG): container finished" podID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerID="5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1" exitCode=0
Apr 23 18:33:25.127414 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.127316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerDied","Data":"5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1"}
Apr 23 18:33:25.134392 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.134365 2578 scope.go:117] "RemoveContainer" containerID="57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d"
Apr 23 18:33:25.146268 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.146243 2578 scope.go:117] "RemoveContainer" containerID="d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee"
Apr 23 18:33:25.162145 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.162091 2578 scope.go:117] "RemoveContainer" containerID="5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f"
Apr 23 18:33:25.162497 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:33:25.162470 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f\": container with ID starting with 5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f not found: ID does not exist" containerID="5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f"
Apr 23 18:33:25.162649 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.162511 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f"} err="failed to get container status \"5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f\": rpc error: code = NotFound desc = could not find container \"5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f\": container with ID starting with 5661d68a5a841694c981545b191d2b1966da55c343fca7275572a926a4ee302f not found: ID does not exist"
Apr 23 18:33:25.162649 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.162557 2578 scope.go:117] "RemoveContainer" containerID="57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d"
Apr 23 18:33:25.162919 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:33:25.162892 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d\": container with ID starting with 57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d not found: ID does not exist" containerID="57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d"
Apr 23 18:33:25.163011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.162929 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d"} err="failed to get container status \"57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d\": rpc error: code = NotFound desc = could not find container \"57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d\": container with ID starting with 57ac76a0068d8af737fbf5802e1a5460e2319c2058a9603133ad0983b41c282d not found: ID does not exist"
Apr 23 18:33:25.163011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.162961 2578 scope.go:117] "RemoveContainer" containerID="d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee"
Apr 23 18:33:25.163280 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:33:25.163259 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee\": container with ID starting with d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee not found: ID does not exist" containerID="d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee"
Apr 23 18:33:25.163331 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.163300 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee"} err="failed to get container status \"d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee\": rpc error: code = NotFound desc = could not find container \"d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee\": container with ID starting with d821fce7989f5abbbcc7d853f6cf8c47a25a7647d8f7c6e5c4dc9812bd4e16ee not found: ID does not exist"
Apr 23 18:33:25.200465 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.200441 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a24fe136-b454-4c80-8b19-6add7ed87987-kserve-provision-location\") pod \"a24fe136-b454-4c80-8b19-6add7ed87987\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") "
Apr 23 18:33:25.200626 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.200480 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls\") pod \"a24fe136-b454-4c80-8b19-6add7ed87987\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") "
Apr 23 18:33:25.200626 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.200564 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c7lg\" (UniqueName: \"kubernetes.io/projected/a24fe136-b454-4c80-8b19-6add7ed87987-kube-api-access-9c7lg\") pod \"a24fe136-b454-4c80-8b19-6add7ed87987\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") "
Apr 23 18:33:25.200626 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.200598 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a24fe136-b454-4c80-8b19-6add7ed87987-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"a24fe136-b454-4c80-8b19-6add7ed87987\" (UID: \"a24fe136-b454-4c80-8b19-6add7ed87987\") "
Apr 23 18:33:25.200824 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.200797 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24fe136-b454-4c80-8b19-6add7ed87987-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a24fe136-b454-4c80-8b19-6add7ed87987" (UID: "a24fe136-b454-4c80-8b19-6add7ed87987"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:33:25.201206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.201058 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a24fe136-b454-4c80-8b19-6add7ed87987-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:33:25.201206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.201069 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24fe136-b454-4c80-8b19-6add7ed87987-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "a24fe136-b454-4c80-8b19-6add7ed87987" (UID: "a24fe136-b454-4c80-8b19-6add7ed87987"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:33:25.202727 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.202702 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a24fe136-b454-4c80-8b19-6add7ed87987" (UID: "a24fe136-b454-4c80-8b19-6add7ed87987"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:33:25.203005 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.202985 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24fe136-b454-4c80-8b19-6add7ed87987-kube-api-access-9c7lg" (OuterVolumeSpecName: "kube-api-access-9c7lg") pod "a24fe136-b454-4c80-8b19-6add7ed87987" (UID: "a24fe136-b454-4c80-8b19-6add7ed87987"). InnerVolumeSpecName "kube-api-access-9c7lg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:33:25.301793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.301754 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9c7lg\" (UniqueName: \"kubernetes.io/projected/a24fe136-b454-4c80-8b19-6add7ed87987-kube-api-access-9c7lg\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:33:25.301793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.301789 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a24fe136-b454-4c80-8b19-6add7ed87987-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:33:25.301793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.301801 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24fe136-b454-4c80-8b19-6add7ed87987-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:33:25.448177 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.448137 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"]
Apr 23 18:33:25.452355 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:25.452320 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-hk6xn"]
Apr 23 18:33:26.133339 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:26.133303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerStarted","Data":"ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc"}
Apr 23 18:33:26.133339 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:26.133345 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerStarted","Data":"c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab"}
Apr 23 18:33:26.133910 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:26.133602 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"
Apr 23 18:33:26.151738 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:26.151686 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podStartSLOduration=7.151666509 podStartE2EDuration="7.151666509s" podCreationTimestamp="2026-04-23 18:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:33:26.150739914 +0000 UTC m=+2090.384481829" watchObservedRunningTime="2026-04-23 18:33:26.151666509 +0000 UTC m=+2090.385408428"
Apr 23 18:33:26.359310 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:26.359278 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" path="/var/lib/kubelet/pods/a24fe136-b454-4c80-8b19-6add7ed87987/volumes"
Apr 23 18:33:27.136387 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:27.136354 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"
Apr 23 18:33:27.137776 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:27.137750 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:33:28.140035 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:28.139985 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:33:33.144743 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:33.144712 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"
Apr 23 18:33:33.145325 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:33.145298 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:33:36.369666 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:36.369634 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 18:33:36.376090 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:36.376061 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 18:33:36.376587 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:36.376565 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 18:33:36.382415 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:36.382388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 18:33:43.146162 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:43.146109 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:33:53.146214 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:33:53.146171 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:34:03.145649 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:03.145603 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:34:13.145815 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:13.145775 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:34:23.145879 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:23.145832 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:34:33.146343 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:33.146258 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:34:43.146668 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:43.146633 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"
Apr 23 18:34:50.109673 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.109623 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"]
Apr 23 18:34:50.112179 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.111334 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kube-rbac-proxy" containerID="cri-o://ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc" gracePeriod=30
Apr 23 18:34:50.112179 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.111320 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" containerID="cri-o://c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab" gracePeriod=30
Apr 23 18:34:50.225565 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225497 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"]
Apr 23 18:34:50.225911 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225881 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container"
Apr 23 18:34:50.225911 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225901 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container"
Apr 23 18:34:50.226111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225935 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="storage-initializer"
Apr 23 18:34:50.226111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225943 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="storage-initializer"
Apr 23 18:34:50.226111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225953 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kube-rbac-proxy"
Apr 23 18:34:50.226111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.225961 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kube-rbac-proxy"
Apr 23 18:34:50.226111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.226040 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kserve-container"
Apr 23 18:34:50.226111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.226053 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a24fe136-b454-4c80-8b19-6add7ed87987" containerName="kube-rbac-proxy"
Apr 23 18:34:50.229356 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.229327 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.231656 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.231620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\""
Apr 23 18:34:50.231752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.231659 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\""
Apr 23 18:34:50.239798 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.239757 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"]
Apr 23 18:34:50.327492 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.327452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7c8f938-8e2c-4a56-bb38-7242d668f527-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.327492 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.327497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7c8f938-8e2c-4a56-bb38-7242d668f527-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.327753 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.327566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7c8f938-8e2c-4a56-bb38-7242d668f527-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.327753 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.327631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjkn\" (UniqueName: \"kubernetes.io/projected/c7c8f938-8e2c-4a56-bb38-7242d668f527-kube-api-access-lsjkn\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.421200 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.421110 2578 generic.go:358] "Generic (PLEG): container finished" podID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerID="ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc" exitCode=2
Apr 23 18:34:50.421200 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.421174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerDied","Data":"ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc"}
Apr 23 18:34:50.428523 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.428493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7c8f938-8e2c-4a56-bb38-7242d668f527-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.428615 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.428555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjkn\" (UniqueName: \"kubernetes.io/projected/c7c8f938-8e2c-4a56-bb38-7242d668f527-kube-api-access-lsjkn\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.428615 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.428599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7c8f938-8e2c-4a56-bb38-7242d668f527-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.428696 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.428623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7c8f938-8e2c-4a56-bb38-7242d668f527-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.429091 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.429068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7c8f938-8e2c-4a56-bb38-7242d668f527-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.429276 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.429256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7c8f938-8e2c-4a56-bb38-7242d668f527-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.431309 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.431287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7c8f938-8e2c-4a56-bb38-7242d668f527-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.437389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.437355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjkn\" (UniqueName: \"kubernetes.io/projected/c7c8f938-8e2c-4a56-bb38-7242d668f527-kube-api-access-lsjkn\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.543986 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.543941 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:34:50.682722 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:50.682622 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"]
Apr 23 18:34:50.685127 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:34:50.685098 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c8f938_8e2c_4a56_bb38_7242d668f527.slice/crio-53d760452ce8eb6da3b95a045b366e46572a7d534a36554315309b180c9c6cd2 WatchSource:0}: Error finding container 53d760452ce8eb6da3b95a045b366e46572a7d534a36554315309b180c9c6cd2: Status 404 returned error can't find the container with id 53d760452ce8eb6da3b95a045b366e46572a7d534a36554315309b180c9c6cd2
Apr 23 18:34:51.425723 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:51.425685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerStarted","Data":"90b297ba0e3dfef948ea1b2ad2b5a452bcb3cbd4b6ba53a4db3e18a86521ce29"}
Apr 23 18:34:51.425723 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:51.425726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerStarted","Data":"53d760452ce8eb6da3b95a045b366e46572a7d534a36554315309b180c9c6cd2"}
Apr 23 18:34:53.140640 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:53.140595 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused"
Apr 23 18:34:53.146096 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:53.146060 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 18:34:55.367332 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.367304 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"
Apr 23 18:34:55.441793 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.441759 2578 generic.go:358] "Generic (PLEG): container finished" podID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerID="c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab" exitCode=0
Apr 23 18:34:55.442015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.441847 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"
Apr 23 18:34:55.442015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.441851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerDied","Data":"c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab"}
Apr 23 18:34:55.442015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.441895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k" event={"ID":"5b97cc30-2c8c-43cd-9f9d-47146ad06629","Type":"ContainerDied","Data":"809b5763dc6bbe1bacdb8862dab8a16adf2057a8ec1f51ca186193c35962d70f"}
Apr 23 18:34:55.442015 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.441918 2578 scope.go:117] "RemoveContainer" containerID="ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc"
Apr 23 18:34:55.443553 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.443505 2578 generic.go:358] "Generic (PLEG): container finished" podID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerID="90b297ba0e3dfef948ea1b2ad2b5a452bcb3cbd4b6ba53a4db3e18a86521ce29" exitCode=0
Apr 23 18:34:55.443680 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.443581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerDied","Data":"90b297ba0e3dfef948ea1b2ad2b5a452bcb3cbd4b6ba53a4db3e18a86521ce29"}
Apr 23 18:34:55.451172 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.451145 2578 scope.go:117] "RemoveContainer" containerID="c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab"
Apr 23 18:34:55.459054 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.459030 2578 scope.go:117] "RemoveContainer" containerID="5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1"
Apr 23 18:34:55.473041 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.473011 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttwtq\" (UniqueName: \"kubernetes.io/projected/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kube-api-access-ttwtq\") pod \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") "
Apr 23 18:34:55.473222 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.473082 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kserve-provision-location\") pod \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") "
Apr 23 18:34:55.473222 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.473119 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls\") pod \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") "
Apr 23 18:34:55.473222 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.473185 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b97cc30-2c8c-43cd-9f9d-47146ad06629-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\" (UID: \"5b97cc30-2c8c-43cd-9f9d-47146ad06629\") "
Apr 23 18:34:55.473491 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.473461 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b97cc30-2c8c-43cd-9f9d-47146ad06629" (UID: "5b97cc30-2c8c-43cd-9f9d-47146ad06629"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:34:55.474061 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.474029 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b97cc30-2c8c-43cd-9f9d-47146ad06629-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "5b97cc30-2c8c-43cd-9f9d-47146ad06629" (UID: "5b97cc30-2c8c-43cd-9f9d-47146ad06629"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:34:55.474640 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.474618 2578 scope.go:117] "RemoveContainer" containerID="ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc"
Apr 23 18:34:55.475379 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:34:55.475005 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc\": container with ID starting with ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc not found: ID does not exist" containerID="ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc"
Apr 23 18:34:55.475506 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.475383 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc"} err="failed to get container status \"ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc\": rpc error: code = NotFound desc = could not find container \"ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc\": container with ID starting with ab1856fcac7930735f56211bd36527da8646ea283fc95d945adb95f57f6ab3bc not found: ID does not exist"
Apr 23 18:34:55.475506 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.475419 2578 scope.go:117] "RemoveContainer" containerID="c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab"
Apr 23 18:34:55.475902 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:34:55.475828 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab\": container with ID starting with c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab not found: ID does not exist" containerID="c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab"
Apr 23 18:34:55.475902 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.475858 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab"} err="failed to get container status \"c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab\": rpc error: code = NotFound desc = could not find container \"c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab\": container with ID starting with c4cd22c7e529b5d3b75e9a27fccb700a7f9d75a8aba88ae58a38acd1964800ab not found: ID does not exist"
Apr 23 18:34:55.475902 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.475879 2578 scope.go:117] "RemoveContainer" containerID="5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1"
Apr 23 18:34:55.476113 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.475869 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kube-api-access-ttwtq" (OuterVolumeSpecName: "kube-api-access-ttwtq") pod "5b97cc30-2c8c-43cd-9f9d-47146ad06629" (UID: "5b97cc30-2c8c-43cd-9f9d-47146ad06629"). InnerVolumeSpecName "kube-api-access-ttwtq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:34:55.477066 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:34:55.476698 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1\": container with ID starting with 5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1 not found: ID does not exist" containerID="5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1"
Apr 23 18:34:55.477066 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.476755 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1"} err="failed to get container status \"5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1\": rpc error: code = NotFound desc = could not find container \"5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1\": container with ID starting with 5ed64269decc571ca9396d6767c650b38a2f5061fe618dca4fb30059fc41dbe1 not found: ID does not exist"
Apr 23 18:34:55.477408 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.477368 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b97cc30-2c8c-43cd-9f9d-47146ad06629" (UID: "5b97cc30-2c8c-43cd-9f9d-47146ad06629"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:34:55.574098 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.574059 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b97cc30-2c8c-43cd-9f9d-47146ad06629-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:34:55.574098 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.574102 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttwtq\" (UniqueName: \"kubernetes.io/projected/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kube-api-access-ttwtq\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:34:55.574287 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.574121 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b97cc30-2c8c-43cd-9f9d-47146ad06629-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:34:55.574287 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.574136 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b97cc30-2c8c-43cd-9f9d-47146ad06629-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:34:55.766650 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.766610 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"] Apr 23 18:34:55.771668 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:55.771634 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-mcx9k"] Apr 23 18:34:56.359401 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:56.359363 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" path="/var/lib/kubelet/pods/5b97cc30-2c8c-43cd-9f9d-47146ad06629/volumes" Apr 23 18:34:56.449339 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:56.449304 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerStarted","Data":"07f252eb4543a409dabb49b195bb3fa295ac5a48de2c1a6f188f64d342b63fe4"} Apr 23 18:34:56.449339 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:56.449339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerStarted","Data":"79adfa500c03aab30b789a9a182a005a8075ccd9c8ec98a022a8029a49c9b2db"} Apr 23 18:34:56.449920 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:56.449581 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" Apr 23 18:34:56.466847 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:56.466790 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podStartSLOduration=6.466770717 podStartE2EDuration="6.466770717s" podCreationTimestamp="2026-04-23 18:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:34:56.466287127 +0000 UTC m=+2180.700029044" watchObservedRunningTime="2026-04-23 18:34:56.466770717 +0000 UTC m=+2180.700512626" Apr 23 18:34:57.454087 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:57.454053 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" Apr 23 18:34:57.455353 ip-10-0-143-63 kubenswrapper[2578]: 
I0423 18:34:57.455324 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:34:58.457889 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:34:58.457852 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:35:03.463059 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:03.463024 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" Apr 23 18:35:03.463695 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:03.463664 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:35:13.463663 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:13.463619 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:35:23.464306 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:23.464264 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:35:33.464067 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:33.464022 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:35:43.464592 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:43.464519 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:35:53.464309 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:35:53.464258 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:36:03.463851 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:03.463748 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:36:13.464456 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:13.464421 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" Apr 23 18:36:20.302819 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.302769 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"] Apr 23 18:36:20.303296 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.303240 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" containerID="cri-o://79adfa500c03aab30b789a9a182a005a8075ccd9c8ec98a022a8029a49c9b2db" gracePeriod=30 Apr 23 18:36:20.303378 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.303272 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kube-rbac-proxy" containerID="cri-o://07f252eb4543a409dabb49b195bb3fa295ac5a48de2c1a6f188f64d342b63fe4" gracePeriod=30 Apr 23 18:36:20.436138 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436098 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"] Apr 23 18:36:20.436469 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436457 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="storage-initializer" Apr 23 18:36:20.436518 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436472 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="storage-initializer" Apr 23 18:36:20.436518 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436486 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" Apr 23 18:36:20.436518 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436492 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" 
containerName="kserve-container" Apr 23 18:36:20.436518 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436506 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kube-rbac-proxy" Apr 23 18:36:20.436518 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436512 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kube-rbac-proxy" Apr 23 18:36:20.436973 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436583 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kube-rbac-proxy" Apr 23 18:36:20.436973 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.436595 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b97cc30-2c8c-43cd-9f9d-47146ad06629" containerName="kserve-container" Apr 23 18:36:20.439734 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.439708 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.441606 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.441580 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 23 18:36:20.441737 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.441601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:36:20.448574 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.448544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"] Apr 23 18:36:20.610812 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.610712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdbd\" (UniqueName: \"kubernetes.io/projected/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kube-api-access-vgdbd\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.610812 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.610762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd23ade7-a87c-49cc-847b-d8a2102dcf48-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.610812 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.610808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd23ade7-a87c-49cc-847b-d8a2102dcf48-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.611078 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.610847 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.712110 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.712072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.712291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.712143 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdbd\" (UniqueName: \"kubernetes.io/projected/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kube-api-access-vgdbd\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.712291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.712174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd23ade7-a87c-49cc-847b-d8a2102dcf48-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.712291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.712205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd23ade7-a87c-49cc-847b-d8a2102dcf48-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.712666 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.712636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.712934 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.712912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd23ade7-a87c-49cc-847b-d8a2102dcf48-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.714645 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.714625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd23ade7-a87c-49cc-847b-d8a2102dcf48-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.720467 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.720441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdbd\" (UniqueName: \"kubernetes.io/projected/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kube-api-access-vgdbd\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.736921 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.736892 2578 generic.go:358] "Generic (PLEG): container finished" podID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerID="07f252eb4543a409dabb49b195bb3fa295ac5a48de2c1a6f188f64d342b63fe4" exitCode=2 Apr 23 18:36:20.737074 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.736942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerDied","Data":"07f252eb4543a409dabb49b195bb3fa295ac5a48de2c1a6f188f64d342b63fe4"} Apr 23 18:36:20.752144 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.752104 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:20.883059 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:20.882957 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"] Apr 23 18:36:20.886954 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:36:20.886923 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd23ade7_a87c_49cc_847b_d8a2102dcf48.slice/crio-c70b6fa3152bd1cb99a1fceefdc59fad26ff59283527868b50396f62c7519c72 WatchSource:0}: Error finding container c70b6fa3152bd1cb99a1fceefdc59fad26ff59283527868b50396f62c7519c72: Status 404 returned error can't find the container with id c70b6fa3152bd1cb99a1fceefdc59fad26ff59283527868b50396f62c7519c72 Apr 23 18:36:21.742118 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:21.742076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerStarted","Data":"4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d"} Apr 23 18:36:21.742118 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:21.742123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerStarted","Data":"c70b6fa3152bd1cb99a1fceefdc59fad26ff59283527868b50396f62c7519c72"} Apr 23 18:36:23.458311 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:23.458258 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: 
connect: connection refused" Apr 23 18:36:23.463768 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:23.463729 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 18:36:24.754871 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:24.754828 2578 generic.go:358] "Generic (PLEG): container finished" podID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerID="4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d" exitCode=0 Apr 23 18:36:24.755269 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:24.754898 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerDied","Data":"4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d"} Apr 23 18:36:25.760751 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.760719 2578 generic.go:358] "Generic (PLEG): container finished" podID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerID="79adfa500c03aab30b789a9a182a005a8075ccd9c8ec98a022a8029a49c9b2db" exitCode=0 Apr 23 18:36:25.761160 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.760796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerDied","Data":"79adfa500c03aab30b789a9a182a005a8075ccd9c8ec98a022a8029a49c9b2db"} Apr 23 18:36:25.762637 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.762614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" 
event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerStarted","Data":"ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d"} Apr 23 18:36:25.762746 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.762643 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerStarted","Data":"14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88"} Apr 23 18:36:25.762987 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.762965 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:25.763058 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.763006 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" Apr 23 18:36:25.785555 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:25.785490 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podStartSLOduration=5.785474827 podStartE2EDuration="5.785474827s" podCreationTimestamp="2026-04-23 18:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:36:25.783928074 +0000 UTC m=+2270.017670000" watchObservedRunningTime="2026-04-23 18:36:25.785474827 +0000 UTC m=+2270.019216746" Apr 23 18:36:26.054973 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.054948 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" Apr 23 18:36:26.162698 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.162658 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsjkn\" (UniqueName: \"kubernetes.io/projected/c7c8f938-8e2c-4a56-bb38-7242d668f527-kube-api-access-lsjkn\") pod \"c7c8f938-8e2c-4a56-bb38-7242d668f527\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " Apr 23 18:36:26.162698 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.162700 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7c8f938-8e2c-4a56-bb38-7242d668f527-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"c7c8f938-8e2c-4a56-bb38-7242d668f527\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " Apr 23 18:36:26.162957 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.162769 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7c8f938-8e2c-4a56-bb38-7242d668f527-proxy-tls\") pod \"c7c8f938-8e2c-4a56-bb38-7242d668f527\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " Apr 23 18:36:26.162957 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.162805 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7c8f938-8e2c-4a56-bb38-7242d668f527-kserve-provision-location\") pod \"c7c8f938-8e2c-4a56-bb38-7242d668f527\" (UID: \"c7c8f938-8e2c-4a56-bb38-7242d668f527\") " Apr 23 18:36:26.163114 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.163081 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c8f938-8e2c-4a56-bb38-7242d668f527-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "c7c8f938-8e2c-4a56-bb38-7242d668f527" (UID: "c7c8f938-8e2c-4a56-bb38-7242d668f527"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:36:26.163186 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.163146 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c8f938-8e2c-4a56-bb38-7242d668f527-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7c8f938-8e2c-4a56-bb38-7242d668f527" (UID: "c7c8f938-8e2c-4a56-bb38-7242d668f527"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:36:26.164932 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.164909 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c8f938-8e2c-4a56-bb38-7242d668f527-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7c8f938-8e2c-4a56-bb38-7242d668f527" (UID: "c7c8f938-8e2c-4a56-bb38-7242d668f527"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:36:26.164998 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.164951 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c8f938-8e2c-4a56-bb38-7242d668f527-kube-api-access-lsjkn" (OuterVolumeSpecName: "kube-api-access-lsjkn") pod "c7c8f938-8e2c-4a56-bb38-7242d668f527" (UID: "c7c8f938-8e2c-4a56-bb38-7242d668f527"). InnerVolumeSpecName "kube-api-access-lsjkn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:36:26.264178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.264128 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lsjkn\" (UniqueName: \"kubernetes.io/projected/c7c8f938-8e2c-4a56-bb38-7242d668f527-kube-api-access-lsjkn\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:36:26.264178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.264168 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7c8f938-8e2c-4a56-bb38-7242d668f527-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:36:26.264178 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.264183 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7c8f938-8e2c-4a56-bb38-7242d668f527-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:36:26.264421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.264197 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7c8f938-8e2c-4a56-bb38-7242d668f527-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:36:26.768284 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.768252 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"
Apr 23 18:36:26.768766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.768253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq" event={"ID":"c7c8f938-8e2c-4a56-bb38-7242d668f527","Type":"ContainerDied","Data":"53d760452ce8eb6da3b95a045b366e46572a7d534a36554315309b180c9c6cd2"}
Apr 23 18:36:26.768766 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.768377 2578 scope.go:117] "RemoveContainer" containerID="07f252eb4543a409dabb49b195bb3fa295ac5a48de2c1a6f188f64d342b63fe4"
Apr 23 18:36:26.776747 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.776726 2578 scope.go:117] "RemoveContainer" containerID="79adfa500c03aab30b789a9a182a005a8075ccd9c8ec98a022a8029a49c9b2db"
Apr 23 18:36:26.784740 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.784716 2578 scope.go:117] "RemoveContainer" containerID="90b297ba0e3dfef948ea1b2ad2b5a452bcb3cbd4b6ba53a4db3e18a86521ce29"
Apr 23 18:36:26.787810 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.787779 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"]
Apr 23 18:36:26.795220 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:26.795171 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-2btqq"]
Apr 23 18:36:28.358160 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:28.358109 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" path="/var/lib/kubelet/pods/c7c8f938-8e2c-4a56-bb38-7242d668f527/volumes"
Apr 23 18:36:31.774354 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:36:31.774312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"
Apr 23 18:37:01.775105 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:01.775061 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:37:11.775757 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:11.775708 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:37:21.775245 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:21.775198 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:37:31.775350 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:31.775253 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:37:38.354372 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:38.354327 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.39:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 18:37:48.358246 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:48.358216 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"
Apr 23 18:37:50.528051 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.528009 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"]
Apr 23 18:37:50.528548 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.528356 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" containerID="cri-o://14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88" gracePeriod=30
Apr 23 18:37:50.528548 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.528398 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kube-rbac-proxy" containerID="cri-o://ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d" gracePeriod=30
Apr 23 18:37:50.629124 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629084 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"]
Apr 23 18:37:50.629458 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629444 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="storage-initializer"
Apr 23 18:37:50.629458 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629459 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="storage-initializer"
Apr 23 18:37:50.629569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629467 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container"
Apr 23 18:37:50.629569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629473 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container"
Apr 23 18:37:50.629569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629486 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kube-rbac-proxy"
Apr 23 18:37:50.629569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629492 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kube-rbac-proxy"
Apr 23 18:37:50.629569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629558 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kube-rbac-proxy"
Apr 23 18:37:50.629569 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.629568 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7c8f938-8e2c-4a56-bb38-7242d668f527" containerName="kserve-container"
Apr 23 18:37:50.633055 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.633025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.635006 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.634984 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\""
Apr 23 18:37:50.635140 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.634990 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 23 18:37:50.648715 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.645857 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"]
Apr 23 18:37:50.766512 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.766453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.766512 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.766514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.766781 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.766619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.766781 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.766695 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2xv\" (UniqueName: \"kubernetes.io/projected/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kube-api-access-sl2xv\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.867645 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.867510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.867645 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.867593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.867645 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.867629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.867947 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:37:50.867645 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-serving-cert: secret "isvc-predictive-xgboost-v2-predictor-serving-cert" not found
Apr 23 18:37:50.867947 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.867683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl2xv\" (UniqueName: \"kubernetes.io/projected/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kube-api-access-sl2xv\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.867947 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:37:50.867713 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls podName:fb1ffc02-5e07-4960-8e15-a7fb96a48ec7 nodeName:}" failed. No retries permitted until 2026-04-23 18:37:51.367696791 +0000 UTC m=+2355.601438686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls") pod "isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" (UID: "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7") : secret "isvc-predictive-xgboost-v2-predictor-serving-cert" not found
Apr 23 18:37:50.868138 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.868115 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.868359 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.868341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:50.876134 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:50.876096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl2xv\" (UniqueName: \"kubernetes.io/projected/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kube-api-access-sl2xv\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:51.049606 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.049569 2578 generic.go:358] "Generic (PLEG): container finished" podID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerID="ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d" exitCode=2
Apr 23 18:37:51.049784 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.049622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerDied","Data":"ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d"}
Apr 23 18:37:51.371450 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.371397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:51.374111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.374077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:51.547041 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.546990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:51.677352 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.677185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"]
Apr 23 18:37:51.680306 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:37:51.680271 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1ffc02_5e07_4960_8e15_a7fb96a48ec7.slice/crio-b42d68923ba41338bf8fb9b71130dbf232c3326128c224c18245702be74c313a WatchSource:0}: Error finding container b42d68923ba41338bf8fb9b71130dbf232c3326128c224c18245702be74c313a: Status 404 returned error can't find the container with id b42d68923ba41338bf8fb9b71130dbf232c3326128c224c18245702be74c313a
Apr 23 18:37:51.768947 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:51.768895 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused"
Apr 23 18:37:52.058812 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:52.058772 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerStarted","Data":"312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165"}
Apr 23 18:37:52.058812 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:52.058825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerStarted","Data":"b42d68923ba41338bf8fb9b71130dbf232c3326128c224c18245702be74c313a"}
Apr 23 18:37:55.972961 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:55.972935 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"
Apr 23 18:37:56.073984 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.073893 2578 generic.go:358] "Generic (PLEG): container finished" podID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerID="312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165" exitCode=0
Apr 23 18:37:56.073984 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.073969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerDied","Data":"312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165"}
Apr 23 18:37:56.075906 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.075878 2578 generic.go:358] "Generic (PLEG): container finished" podID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerID="14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88" exitCode=0
Apr 23 18:37:56.076047 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.075955 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"
Apr 23 18:37:56.076047 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.075967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerDied","Data":"14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88"}
Apr 23 18:37:56.076047 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.076007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4" event={"ID":"dd23ade7-a87c-49cc-847b-d8a2102dcf48","Type":"ContainerDied","Data":"c70b6fa3152bd1cb99a1fceefdc59fad26ff59283527868b50396f62c7519c72"}
Apr 23 18:37:56.076047 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.076024 2578 scope.go:117] "RemoveContainer" containerID="ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d"
Apr 23 18:37:56.084427 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.084402 2578 scope.go:117] "RemoveContainer" containerID="14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88"
Apr 23 18:37:56.093842 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.093814 2578 scope.go:117] "RemoveContainer" containerID="4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d"
Apr 23 18:37:56.107148 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.107123 2578 scope.go:117] "RemoveContainer" containerID="ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d"
Apr 23 18:37:56.107463 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:37:56.107442 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d\": container with ID starting with ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d not found: ID does not exist" containerID="ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d"
Apr 23 18:37:56.107572 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.107471 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d"} err="failed to get container status \"ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d\": rpc error: code = NotFound desc = could not find container \"ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d\": container with ID starting with ca3749aaa438cc0ece522803ce734167d9c2a6d022d6b061093f1039e1b63b5d not found: ID does not exist"
Apr 23 18:37:56.107572 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.107494 2578 scope.go:117] "RemoveContainer" containerID="14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88"
Apr 23 18:37:56.107874 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:37:56.107849 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88\": container with ID starting with 14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88 not found: ID does not exist" containerID="14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88"
Apr 23 18:37:56.107978 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.107878 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88"} err="failed to get container status \"14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88\": rpc error: code = NotFound desc = could not find container \"14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88\": container with ID starting with 14d30a04a298cc29fc587de9bef7ffb64071146c57b5b8e042e93d41a9d51b88 not found: ID does not exist"
Apr 23 18:37:56.107978 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.107896 2578 scope.go:117] "RemoveContainer" containerID="4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d"
Apr 23 18:37:56.108155 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:37:56.108132 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d\": container with ID starting with 4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d not found: ID does not exist" containerID="4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d"
Apr 23 18:37:56.108197 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.108168 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d"} err="failed to get container status \"4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d\": rpc error: code = NotFound desc = could not find container \"4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d\": container with ID starting with 4574aef62f80f3e30d1d0e4fc3270c42f38ab1eb38ef98f12a25b3fa54aa750d not found: ID does not exist"
Apr 23 18:37:56.114614 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.114528 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kserve-provision-location\") pod \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") "
Apr 23 18:37:56.114795 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.114638 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdbd\" (UniqueName: \"kubernetes.io/projected/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kube-api-access-vgdbd\") pod \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") "
Apr 23 18:37:56.114795 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.114680 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd23ade7-a87c-49cc-847b-d8a2102dcf48-proxy-tls\") pod \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") "
Apr 23 18:37:56.114795 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.114744 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd23ade7-a87c-49cc-847b-d8a2102dcf48-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\" (UID: \"dd23ade7-a87c-49cc-847b-d8a2102dcf48\") "
Apr 23 18:37:56.115291 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.114968 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd23ade7-a87c-49cc-847b-d8a2102dcf48" (UID: "dd23ade7-a87c-49cc-847b-d8a2102dcf48"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:37:56.115479 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.115339 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd23ade7-a87c-49cc-847b-d8a2102dcf48-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "dd23ade7-a87c-49cc-847b-d8a2102dcf48" (UID: "dd23ade7-a87c-49cc-847b-d8a2102dcf48"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:37:56.117314 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.117284 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd23ade7-a87c-49cc-847b-d8a2102dcf48-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dd23ade7-a87c-49cc-847b-d8a2102dcf48" (UID: "dd23ade7-a87c-49cc-847b-d8a2102dcf48"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:37:56.117404 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.117315 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kube-api-access-vgdbd" (OuterVolumeSpecName: "kube-api-access-vgdbd") pod "dd23ade7-a87c-49cc-847b-d8a2102dcf48" (UID: "dd23ade7-a87c-49cc-847b-d8a2102dcf48"). InnerVolumeSpecName "kube-api-access-vgdbd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:37:56.215869 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.215828 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vgdbd\" (UniqueName: \"kubernetes.io/projected/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kube-api-access-vgdbd\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:37:56.215869 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.215870 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd23ade7-a87c-49cc-847b-d8a2102dcf48-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:37:56.216128 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.215886 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd23ade7-a87c-49cc-847b-d8a2102dcf48-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:37:56.216128 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.215901 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd23ade7-a87c-49cc-847b-d8a2102dcf48-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:37:56.392396 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.392346 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"]
Apr 23 18:37:56.395645 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:56.395608 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-xrqm4"]
Apr 23 18:37:57.082521 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:57.082482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerStarted","Data":"775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64"}
Apr 23 18:37:57.082932 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:57.082557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerStarted","Data":"0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5"}
Apr 23 18:37:57.082932 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:57.082784 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:57.082932 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:57.082903 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:37:57.102520 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:57.102460 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podStartSLOduration=7.102436819 podStartE2EDuration="7.102436819s" podCreationTimestamp="2026-04-23 18:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:37:57.101440875 +0000 UTC m=+2361.335182790" watchObservedRunningTime="2026-04-23 18:37:57.102436819 +0000 UTC m=+2361.336178736"
Apr 23 18:37:58.358389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:37:58.358350 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" path="/var/lib/kubelet/pods/dd23ade7-a87c-49cc-847b-d8a2102dcf48/volumes"
Apr 23 18:38:03.093082 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:03.093051 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:38:33.094199 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:33.094151 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 23 18:38:36.392951 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:36.392921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 18:38:36.399285 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:36.399261 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 18:38:36.399469 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:36.399452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 18:38:36.405675 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:36.405653 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 18:38:43.094130 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:43.094090 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 23 18:38:53.093618 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:38:53.093576 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 23 18:39:03.093595 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:03.093500 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 23 18:39:04.354009 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:04.353967 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 23 18:39:14.357892 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:14.357862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"
Apr 23 18:39:20.743493 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.743456 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"]
Apr 23 18:39:20.743998 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.743843 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" containerID="cri-o://0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5" gracePeriod=30
Apr 23 18:39:20.744073 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.743995 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kube-rbac-proxy" containerID="cri-o://775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64" gracePeriod=30
Apr 23 18:39:20.865284 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865252 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km"]
Apr 23 18:39:20.865631
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865616 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="storage-initializer" Apr 23 18:39:20.865631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865631 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="storage-initializer" Apr 23 18:39:20.865742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865644 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kube-rbac-proxy" Apr 23 18:39:20.865742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865650 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kube-rbac-proxy" Apr 23 18:39:20.865742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865661 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" Apr 23 18:39:20.865742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865667 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" Apr 23 18:39:20.865742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865720 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kube-rbac-proxy" Apr 23 18:39:20.865742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.865728 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd23ade7-a87c-49cc-847b-d8a2102dcf48" containerName="kserve-container" Apr 23 18:39:20.868695 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.868674 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:20.870470 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.870446 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:39:20.870564 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.870509 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 23 18:39:20.877421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:20.877388 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km"] Apr 23 18:39:21.024672 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.024622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0536c0dc-d064-4425-a56d-8b149f134864-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.024672 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.024681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0536c0dc-d064-4425-a56d-8b149f134864-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.024928 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.024740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97z2\" 
(UniqueName: \"kubernetes.io/projected/0536c0dc-d064-4425-a56d-8b149f134864-kube-api-access-b97z2\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.024928 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.024760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0536c0dc-d064-4425-a56d-8b149f134864-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.126179 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.126119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0536c0dc-d064-4425-a56d-8b149f134864-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.126179 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.126192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0536c0dc-d064-4425-a56d-8b149f134864-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.126410 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.126241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b97z2\" 
(UniqueName: \"kubernetes.io/projected/0536c0dc-d064-4425-a56d-8b149f134864-kube-api-access-b97z2\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.126410 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.126269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0536c0dc-d064-4425-a56d-8b149f134864-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.126642 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.126623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0536c0dc-d064-4425-a56d-8b149f134864-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.127058 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.127034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0536c0dc-d064-4425-a56d-8b149f134864-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.128968 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.128948 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0536c0dc-d064-4425-a56d-8b149f134864-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.134515 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.134493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97z2\" (UniqueName: \"kubernetes.io/projected/0536c0dc-d064-4425-a56d-8b149f134864-kube-api-access-b97z2\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.179700 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.179657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:21.312035 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.311954 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km"] Apr 23 18:39:21.315343 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:39:21.315315 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0536c0dc_d064_4425_a56d_8b149f134864.slice/crio-00fd1581e01be85c62f60942ba141e0001c72bea0e6f59ab36c6dc7d95d0b946 WatchSource:0}: Error finding container 00fd1581e01be85c62f60942ba141e0001c72bea0e6f59ab36c6dc7d95d0b946: Status 404 returned error can't find the container with id 00fd1581e01be85c62f60942ba141e0001c72bea0e6f59ab36c6dc7d95d0b946 Apr 23 18:39:21.317096 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.317079 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 
18:39:21.375142 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.375105 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerStarted","Data":"fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846"} Apr 23 18:39:21.375142 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.375145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerStarted","Data":"00fd1581e01be85c62f60942ba141e0001c72bea0e6f59ab36c6dc7d95d0b946"} Apr 23 18:39:21.377096 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.377067 2578 generic.go:358] "Generic (PLEG): container finished" podID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerID="775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64" exitCode=2 Apr 23 18:39:21.377232 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:21.377111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerDied","Data":"775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64"} Apr 23 18:39:23.088345 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:23.088297 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 23 18:39:24.354153 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:24.354105 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.40:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 18:39:25.390719 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:25.390625 2578 generic.go:358] "Generic (PLEG): container finished" podID="0536c0dc-d064-4425-a56d-8b149f134864" containerID="fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846" exitCode=0 Apr 23 18:39:25.390719 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:25.390685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerDied","Data":"fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846"} Apr 23 18:39:25.978959 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:25.978934 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" Apr 23 18:39:26.168851 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.168752 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl2xv\" (UniqueName: \"kubernetes.io/projected/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kube-api-access-sl2xv\") pod \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " Apr 23 18:39:26.168851 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.168845 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls\") pod \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " Apr 23 18:39:26.169052 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.169019 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " Apr 23 18:39:26.169102 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.169072 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kserve-provision-location\") pod \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\" (UID: \"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7\") " Apr 23 18:39:26.169375 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.169342 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" (UID: "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:39:26.169375 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.169362 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" (UID: "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:39:26.171486 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.171448 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" (UID: "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:39:26.171486 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.171469 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kube-api-access-sl2xv" (OuterVolumeSpecName: "kube-api-access-sl2xv") pod "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" (UID: "fb1ffc02-5e07-4960-8e15-a7fb96a48ec7"). InnerVolumeSpecName "kube-api-access-sl2xv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:39:26.269707 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.269671 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sl2xv\" (UniqueName: \"kubernetes.io/projected/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kube-api-access-sl2xv\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:39:26.269707 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.269704 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:39:26.269707 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.269715 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:39:26.269951 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.269726 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:39:26.395730 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.395694 2578 generic.go:358] "Generic (PLEG): container finished" podID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerID="0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5" exitCode=0 Apr 23 18:39:26.396175 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.395770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" 
event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerDied","Data":"0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5"} Apr 23 18:39:26.396175 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.395806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" event={"ID":"fb1ffc02-5e07-4960-8e15-a7fb96a48ec7","Type":"ContainerDied","Data":"b42d68923ba41338bf8fb9b71130dbf232c3326128c224c18245702be74c313a"} Apr 23 18:39:26.396175 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.395782 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd" Apr 23 18:39:26.396175 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.395845 2578 scope.go:117] "RemoveContainer" containerID="775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64" Apr 23 18:39:26.397978 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.397948 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerStarted","Data":"893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8"} Apr 23 18:39:26.398103 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.397989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerStarted","Data":"252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47"} Apr 23 18:39:26.398347 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.398257 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:26.398347 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:39:26.398292 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:39:26.404614 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.404589 2578 scope.go:117] "RemoveContainer" containerID="0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5" Apr 23 18:39:26.414060 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.414031 2578 scope.go:117] "RemoveContainer" containerID="312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165" Apr 23 18:39:26.414636 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.414603 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"] Apr 23 18:39:26.416334 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.416306 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-khcxd"] Apr 23 18:39:26.422883 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.422591 2578 scope.go:117] "RemoveContainer" containerID="775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64" Apr 23 18:39:26.425103 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:39:26.425068 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64\": container with ID starting with 775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64 not found: ID does not exist" containerID="775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64" Apr 23 18:39:26.425247 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.425115 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64"} err="failed to get container status 
\"775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64\": rpc error: code = NotFound desc = could not find container \"775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64\": container with ID starting with 775225cfb2f9c28dcc7486cce49c0f7362eea548a48993564eaac5b92ea3eb64 not found: ID does not exist" Apr 23 18:39:26.425247 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.425144 2578 scope.go:117] "RemoveContainer" containerID="0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5" Apr 23 18:39:26.425499 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:39:26.425474 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5\": container with ID starting with 0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5 not found: ID does not exist" containerID="0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5" Apr 23 18:39:26.425587 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.425506 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5"} err="failed to get container status \"0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5\": rpc error: code = NotFound desc = could not find container \"0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5\": container with ID starting with 0251db5b8495a7754eb591a7b083c168fd3075ae788632a7172b009cd8617ee5 not found: ID does not exist" Apr 23 18:39:26.425587 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.425525 2578 scope.go:117] "RemoveContainer" containerID="312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165" Apr 23 18:39:26.425824 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:39:26.425804 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165\": container with ID starting with 312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165 not found: ID does not exist" containerID="312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165" Apr 23 18:39:26.425865 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.425830 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165"} err="failed to get container status \"312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165\": rpc error: code = NotFound desc = could not find container \"312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165\": container with ID starting with 312af134e7071340344bbb7170ddddec9c97b95cef0ea40cd4226976ba2a3165 not found: ID does not exist" Apr 23 18:39:26.430158 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:26.430113 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podStartSLOduration=6.430095643 podStartE2EDuration="6.430095643s" podCreationTimestamp="2026-04-23 18:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:39:26.428743562 +0000 UTC m=+2450.662485477" watchObservedRunningTime="2026-04-23 18:39:26.430095643 +0000 UTC m=+2450.663837559" Apr 23 18:39:28.358419 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:28.358381 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" path="/var/lib/kubelet/pods/fb1ffc02-5e07-4960-8e15-a7fb96a48ec7/volumes" Apr 23 18:39:32.407907 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:39:32.407871 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:40:02.408586 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:02.408516 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:40:12.409082 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:12.409041 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:40:22.409016 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:22.408975 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:40:32.408836 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:32.408790 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:40:42.412628 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:40:42.412598 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:40:50.975231 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:50.975186 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km"] Apr 23 18:40:50.976237 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:50.976175 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" containerID="cri-o://252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47" gracePeriod=30 Apr 23 18:40:50.976460 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:50.976229 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kube-rbac-proxy" containerID="cri-o://893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8" gracePeriod=30 Apr 23 18:40:51.680514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:51.680478 2578 generic.go:358] "Generic (PLEG): container finished" podID="0536c0dc-d064-4425-a56d-8b149f134864" containerID="893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8" exitCode=2 Apr 23 18:40:51.680708 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:51.680564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerDied","Data":"893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8"} Apr 23 18:40:52.403305 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:52.403255 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused" Apr 23 18:40:52.409279 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:52.409238 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.41:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 18:40:56.516658 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.516626 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:40:56.597585 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.597529 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b97z2\" (UniqueName: \"kubernetes.io/projected/0536c0dc-d064-4425-a56d-8b149f134864-kube-api-access-b97z2\") pod \"0536c0dc-d064-4425-a56d-8b149f134864\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " Apr 23 18:40:56.597825 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.597612 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0536c0dc-d064-4425-a56d-8b149f134864-kserve-provision-location\") pod \"0536c0dc-d064-4425-a56d-8b149f134864\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " Apr 23 18:40:56.597825 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.597685 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0536c0dc-d064-4425-a56d-8b149f134864-proxy-tls\") pod \"0536c0dc-d064-4425-a56d-8b149f134864\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " Apr 23 18:40:56.597825 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.597710 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0536c0dc-d064-4425-a56d-8b149f134864-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"0536c0dc-d064-4425-a56d-8b149f134864\" (UID: \"0536c0dc-d064-4425-a56d-8b149f134864\") " Apr 23 18:40:56.598029 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.597898 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0536c0dc-d064-4425-a56d-8b149f134864-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0536c0dc-d064-4425-a56d-8b149f134864" (UID: "0536c0dc-d064-4425-a56d-8b149f134864"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:56.598085 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.598065 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0536c0dc-d064-4425-a56d-8b149f134864-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "0536c0dc-d064-4425-a56d-8b149f134864" (UID: "0536c0dc-d064-4425-a56d-8b149f134864"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:40:56.599811 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.599777 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0536c0dc-d064-4425-a56d-8b149f134864-kube-api-access-b97z2" (OuterVolumeSpecName: "kube-api-access-b97z2") pod "0536c0dc-d064-4425-a56d-8b149f134864" (UID: "0536c0dc-d064-4425-a56d-8b149f134864"). InnerVolumeSpecName "kube-api-access-b97z2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:40:56.599936 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.599885 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0536c0dc-d064-4425-a56d-8b149f134864-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0536c0dc-d064-4425-a56d-8b149f134864" (UID: "0536c0dc-d064-4425-a56d-8b149f134864"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:40:56.698935 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.698842 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0536c0dc-d064-4425-a56d-8b149f134864-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:40:56.698935 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.698870 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0536c0dc-d064-4425-a56d-8b149f134864-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:40:56.698935 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.698882 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0536c0dc-d064-4425-a56d-8b149f134864-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 
23 18:40:56.698935 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.698897 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b97z2\" (UniqueName: \"kubernetes.io/projected/0536c0dc-d064-4425-a56d-8b149f134864-kube-api-access-b97z2\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:40:56.700571 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.700519 2578 generic.go:358] "Generic (PLEG): container finished" podID="0536c0dc-d064-4425-a56d-8b149f134864" containerID="252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47" exitCode=0 Apr 23 18:40:56.700720 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.700596 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerDied","Data":"252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47"} Apr 23 18:40:56.700720 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.700619 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" Apr 23 18:40:56.700720 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.700630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km" event={"ID":"0536c0dc-d064-4425-a56d-8b149f134864","Type":"ContainerDied","Data":"00fd1581e01be85c62f60942ba141e0001c72bea0e6f59ab36c6dc7d95d0b946"} Apr 23 18:40:56.700720 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.700651 2578 scope.go:117] "RemoveContainer" containerID="893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8" Apr 23 18:40:56.709773 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.709752 2578 scope.go:117] "RemoveContainer" containerID="252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47" Apr 23 18:40:56.717342 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.717325 2578 scope.go:117] "RemoveContainer" containerID="fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846" Apr 23 18:40:56.723057 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.723027 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km"] Apr 23 18:40:56.725475 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.725452 2578 scope.go:117] "RemoveContainer" containerID="893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8" Apr 23 18:40:56.726025 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:40:56.725862 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8\": container with ID starting with 893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8 not found: ID does not exist" containerID="893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8" Apr 23 18:40:56.726025 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:40:56.725934 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8"} err="failed to get container status \"893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8\": rpc error: code = NotFound desc = could not find container \"893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8\": container with ID starting with 893b502d045695a58958a96b99d59e3221f7e7a8c86041f40600e61fdf82acc8 not found: ID does not exist" Apr 23 18:40:56.726025 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.725976 2578 scope.go:117] "RemoveContainer" containerID="252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47" Apr 23 18:40:56.726301 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:40:56.726272 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47\": container with ID starting with 252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47 not found: ID does not exist" containerID="252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47" Apr 23 18:40:56.726421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.726303 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47"} err="failed to get container status \"252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47\": rpc error: code = NotFound desc = could not find container \"252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47\": container with ID starting with 252047bcd375c5968b8be25c313ce860fda3583908110245c348375389778e47 not found: ID does not exist" Apr 23 18:40:56.726421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.726326 2578 scope.go:117] "RemoveContainer" 
containerID="fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846" Apr 23 18:40:56.726707 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:40:56.726681 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846\": container with ID starting with fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846 not found: ID does not exist" containerID="fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846" Apr 23 18:40:56.726784 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.726716 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846"} err="failed to get container status \"fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846\": rpc error: code = NotFound desc = could not find container \"fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846\": container with ID starting with fae7481f7761265b53a0488d9a8ea70069a54351a450ded3d641efbc6600c846 not found: ID does not exist" Apr 23 18:40:56.727524 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:56.727503 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8m8km"] Apr 23 18:40:58.358633 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:40:58.358591 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0536c0dc-d064-4425-a56d-8b149f134864" path="/var/lib/kubelet/pods/0536c0dc-d064-4425-a56d-8b149f134864/volumes" Apr 23 18:41:59.510584 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.510483 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64444cd4cc-bz564"] Apr 23 18:41:59.511486 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511455 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" Apr 23 18:41:59.511486 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511485 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511495 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kube-rbac-proxy" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511503 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kube-rbac-proxy" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511514 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="storage-initializer" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511521 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="storage-initializer" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511572 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kube-rbac-proxy" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511581 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kube-rbac-proxy" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511597 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511605 2578 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511620 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="storage-initializer" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511628 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="storage-initializer" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511706 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kserve-container" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511721 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb1ffc02-5e07-4960-8e15-a7fb96a48ec7" containerName="kube-rbac-proxy" Apr 23 18:41:59.511725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511732 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kube-rbac-proxy" Apr 23 18:41:59.512329 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.511745 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0536c0dc-d064-4425-a56d-8b149f134864" containerName="kserve-container" Apr 23 18:41:59.514902 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.514879 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.517052 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 18:41:59.517164 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517090 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 18:41:59.517164 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517104 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 18:41:59.517164 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517110 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 18:41:59.517633 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517615 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 18:41:59.517739 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517666 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 18:41:59.517801 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517764 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 18:41:59.517801 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.517799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jm8xd\"" Apr 23 18:41:59.524440 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.524419 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 18:41:59.550900 ip-10-0-143-63 kubenswrapper[2578]: 
I0423 18:41:59.550871 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64444cd4cc-bz564"] Apr 23 18:41:59.622458 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-oauth-config\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.622458 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-trusted-ca-bundle\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.622733 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-service-ca\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.622733 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pb6w\" (UniqueName: \"kubernetes.io/projected/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-kube-api-access-8pb6w\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.622733 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622678 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-serving-cert\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.622881 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-oauth-serving-cert\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.622881 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.622754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-config\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.723841 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pb6w\" (UniqueName: \"kubernetes.io/projected/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-kube-api-access-8pb6w\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.723841 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-serving-cert\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " 
pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724101 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-oauth-serving-cert\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724101 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-config\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724101 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-oauth-config\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724101 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-trusted-ca-bundle\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724101 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.723955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-service-ca\") pod \"console-64444cd4cc-bz564\" 
(UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724776 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.724745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-oauth-serving-cert\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.724750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-config\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.724922 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.724755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-service-ca\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.725044 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.724979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-trusted-ca-bundle\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.726461 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.726434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-oauth-config\") pod 
\"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.726725 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.726705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-console-serving-cert\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.733861 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.733841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pb6w\" (UniqueName: \"kubernetes.io/projected/499a7a87-1fe5-4e78-8769-1cdd08a5ad3c-kube-api-access-8pb6w\") pod \"console-64444cd4cc-bz564\" (UID: \"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c\") " pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.824892 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.824803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:41:59.953114 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:41:59.953079 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64444cd4cc-bz564"] Apr 23 18:41:59.956309 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:41:59.956281 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499a7a87_1fe5_4e78_8769_1cdd08a5ad3c.slice/crio-3f6e4405e116f30f93fe14200e712bd6542ce5931ce34e267c34825c8f20bdf4 WatchSource:0}: Error finding container 3f6e4405e116f30f93fe14200e712bd6542ce5931ce34e267c34825c8f20bdf4: Status 404 returned error can't find the container with id 3f6e4405e116f30f93fe14200e712bd6542ce5931ce34e267c34825c8f20bdf4 Apr 23 18:42:00.918682 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:00.918635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64444cd4cc-bz564" event={"ID":"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c","Type":"ContainerStarted","Data":"4fc4309389b40e4064850c8c80042915b0fba384d22839bf83bc8ee5f2749ac8"} Apr 23 18:42:00.918682 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:00.918683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64444cd4cc-bz564" event={"ID":"499a7a87-1fe5-4e78-8769-1cdd08a5ad3c","Type":"ContainerStarted","Data":"3f6e4405e116f30f93fe14200e712bd6542ce5931ce34e267c34825c8f20bdf4"} Apr 23 18:42:00.936938 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:00.936882 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64444cd4cc-bz564" podStartSLOduration=1.936868178 podStartE2EDuration="1.936868178s" podCreationTimestamp="2026-04-23 18:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:42:00.935801833 +0000 UTC m=+2605.169543750" 
watchObservedRunningTime="2026-04-23 18:42:00.936868178 +0000 UTC m=+2605.170610094" Apr 23 18:42:09.825752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:09.825693 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:42:09.825752 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:09.825757 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:42:09.830818 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:09.830786 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:42:09.956717 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:42:09.956673 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64444cd4cc-bz564" Apr 23 18:43:36.414594 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:43:36.414566 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:43:36.420882 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:43:36.420860 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:43:36.423156 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:43:36.423136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:43:36.429008 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:43:36.428988 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:47:31.439413 ip-10-0-143-63 kubenswrapper[2578]: 
I0423 18:47:31.439377 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng"] Apr 23 18:47:31.443003 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.442980 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.445655 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.445524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 23 18:47:31.445822 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.445676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:47:31.445822 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.445798 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:47:31.445822 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.445812 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:47:31.446125 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.446106 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 23 18:47:31.460590 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.460474 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng"] Apr 23 18:47:31.540259 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.540226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74n84\" (UniqueName: \"kubernetes.io/projected/74bd9057-45b9-4ccb-9449-e286b47d1437-kube-api-access-74n84\") pod 
\"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.540438 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.540272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74bd9057-45b9-4ccb-9449-e286b47d1437-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.540438 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.540360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.540438 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.540418 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74bd9057-45b9-4ccb-9449-e286b47d1437-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.640862 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.640822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74n84\" (UniqueName: \"kubernetes.io/projected/74bd9057-45b9-4ccb-9449-e286b47d1437-kube-api-access-74n84\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: 
\"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.641055 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.640870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74bd9057-45b9-4ccb-9449-e286b47d1437-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.641055 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.640918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.641055 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.640951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74bd9057-45b9-4ccb-9449-e286b47d1437-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.641224 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:47:31.641073 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: secret "isvc-tensorflow-predictor-serving-cert" not found Apr 23 18:47:31.641224 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:47:31.641152 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls 
podName:74bd9057-45b9-4ccb-9449-e286b47d1437 nodeName:}" failed. No retries permitted until 2026-04-23 18:47:32.141131057 +0000 UTC m=+2936.374872958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-zkcng" (UID: "74bd9057-45b9-4ccb-9449-e286b47d1437") : secret "isvc-tensorflow-predictor-serving-cert" not found Apr 23 18:47:31.641328 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.641306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74bd9057-45b9-4ccb-9449-e286b47d1437-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.641717 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.641699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74bd9057-45b9-4ccb-9449-e286b47d1437-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:31.652047 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:31.652013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74n84\" (UniqueName: \"kubernetes.io/projected/74bd9057-45b9-4ccb-9449-e286b47d1437-kube-api-access-74n84\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:32.145710 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:32.145667 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:32.148102 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:32.148072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zkcng\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:32.353546 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:32.353503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:32.490745 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:32.490714 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng"] Apr 23 18:47:32.493883 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:47:32.493850 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74bd9057_45b9_4ccb_9449_e286b47d1437.slice/crio-906d702eb59f7120e6a2cc93e16601bad45f2bbdc50777bd80cd930baeadb947 WatchSource:0}: Error finding container 906d702eb59f7120e6a2cc93e16601bad45f2bbdc50777bd80cd930baeadb947: Status 404 returned error can't find the container with id 906d702eb59f7120e6a2cc93e16601bad45f2bbdc50777bd80cd930baeadb947 Apr 23 18:47:32.495750 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:32.495729 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:47:33.040713 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:33.040668 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerStarted","Data":"0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06"} Apr 23 18:47:33.040897 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:33.040721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerStarted","Data":"906d702eb59f7120e6a2cc93e16601bad45f2bbdc50777bd80cd930baeadb947"} Apr 23 18:47:38.058881 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:38.058836 2578 generic.go:358] "Generic (PLEG): container finished" podID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerID="0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06" exitCode=0 Apr 23 18:47:38.059296 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:38.058909 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerDied","Data":"0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06"} Apr 23 18:47:42.077396 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:42.077353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerStarted","Data":"f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9"} Apr 23 18:47:42.077396 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:42.077399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerStarted","Data":"2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694"} Apr 23 18:47:42.077915 
ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:42.077690 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:42.077915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:42.077724 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:42.079076 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:42.079046 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:47:42.101600 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:42.101513 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podStartSLOduration=7.530285143 podStartE2EDuration="11.101497576s" podCreationTimestamp="2026-04-23 18:47:31 +0000 UTC" firstStartedPulling="2026-04-23 18:47:38.060093047 +0000 UTC m=+2942.293834946" lastFinishedPulling="2026-04-23 18:47:41.631305472 +0000 UTC m=+2945.865047379" observedRunningTime="2026-04-23 18:47:42.099121511 +0000 UTC m=+2946.332863462" watchObservedRunningTime="2026-04-23 18:47:42.101497576 +0000 UTC m=+2946.335239493" Apr 23 18:47:43.080836 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:43.080794 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:47:48.085278 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:48.085242 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:47:48.085864 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:48.085837 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 18:47:58.086628 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:47:58.086522 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:48:13.265642 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:13.265599 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng"] Apr 23 18:48:13.266090 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:13.266036 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" containerID="cri-o://2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694" gracePeriod=30 Apr 23 18:48:13.266169 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:13.266072 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" containerID="cri-o://f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9" gracePeriod=30 Apr 23 18:48:14.191779 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:14.191744 2578 generic.go:358] "Generic (PLEG): container finished" podID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerID="f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9" exitCode=2 Apr 23 18:48:14.191962 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:48:14.191816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerDied","Data":"f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9"} Apr 23 18:48:18.081233 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:18.081186 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:48:23.081818 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:23.081772 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:48:28.081649 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:28.081600 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:48:28.082087 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:28.081741 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:48:33.081261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:33.081217 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" 
podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:48:36.436713 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:36.436679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:48:36.443389 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:36.443360 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:48:36.446391 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:36.446370 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:48:36.452460 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:36.452433 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:48:38.081461 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:38.081407 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:48:43.081068 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.081020 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 23 18:48:43.911810 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.911779 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:48:43.927903 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.927874 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74bd9057-45b9-4ccb-9449-e286b47d1437-kserve-provision-location\") pod \"74bd9057-45b9-4ccb-9449-e286b47d1437\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " Apr 23 18:48:43.928079 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.927911 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls\") pod \"74bd9057-45b9-4ccb-9449-e286b47d1437\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " Apr 23 18:48:43.928079 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.928015 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74bd9057-45b9-4ccb-9449-e286b47d1437-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"74bd9057-45b9-4ccb-9449-e286b47d1437\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " Apr 23 18:48:43.928079 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.928054 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74n84\" (UniqueName: \"kubernetes.io/projected/74bd9057-45b9-4ccb-9449-e286b47d1437-kube-api-access-74n84\") pod \"74bd9057-45b9-4ccb-9449-e286b47d1437\" (UID: \"74bd9057-45b9-4ccb-9449-e286b47d1437\") " Apr 23 18:48:43.928420 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.928385 2578 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bd9057-45b9-4ccb-9449-e286b47d1437-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "74bd9057-45b9-4ccb-9449-e286b47d1437" (UID: "74bd9057-45b9-4ccb-9449-e286b47d1437"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:48:43.931844 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.931810 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74bd9057-45b9-4ccb-9449-e286b47d1437-kube-api-access-74n84" (OuterVolumeSpecName: "kube-api-access-74n84") pod "74bd9057-45b9-4ccb-9449-e286b47d1437" (UID: "74bd9057-45b9-4ccb-9449-e286b47d1437"). InnerVolumeSpecName "kube-api-access-74n84". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:48:43.933473 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.933444 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "74bd9057-45b9-4ccb-9449-e286b47d1437" (UID: "74bd9057-45b9-4ccb-9449-e286b47d1437"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:48:43.941529 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:43.941499 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74bd9057-45b9-4ccb-9449-e286b47d1437-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "74bd9057-45b9-4ccb-9449-e286b47d1437" (UID: "74bd9057-45b9-4ccb-9449-e286b47d1437"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:48:44.028669 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.028617 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-74n84\" (UniqueName: \"kubernetes.io/projected/74bd9057-45b9-4ccb-9449-e286b47d1437-kube-api-access-74n84\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:48:44.028669 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.028659 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74bd9057-45b9-4ccb-9449-e286b47d1437-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:48:44.028669 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.028671 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74bd9057-45b9-4ccb-9449-e286b47d1437-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:48:44.028914 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.028683 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74bd9057-45b9-4ccb-9449-e286b47d1437-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:48:44.293217 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.293179 2578 generic.go:358] "Generic (PLEG): container finished" podID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerID="2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694" exitCode=137 Apr 23 18:48:44.293763 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.293262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" 
event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerDied","Data":"2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694"} Apr 23 18:48:44.293763 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.293304 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" event={"ID":"74bd9057-45b9-4ccb-9449-e286b47d1437","Type":"ContainerDied","Data":"906d702eb59f7120e6a2cc93e16601bad45f2bbdc50777bd80cd930baeadb947"} Apr 23 18:48:44.293763 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.293321 2578 scope.go:117] "RemoveContainer" containerID="f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9" Apr 23 18:48:44.293763 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.293272 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng" Apr 23 18:48:44.301890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.301868 2578 scope.go:117] "RemoveContainer" containerID="2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694" Apr 23 18:48:44.309032 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.309013 2578 scope.go:117] "RemoveContainer" containerID="0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06" Apr 23 18:48:44.315189 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.315163 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng"] Apr 23 18:48:44.317150 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.317126 2578 scope.go:117] "RemoveContainer" containerID="f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9" Apr 23 18:48:44.317460 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:48:44.317439 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9\": container with ID starting with f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9 not found: ID does not exist" containerID="f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9" Apr 23 18:48:44.317561 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.317469 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9"} err="failed to get container status \"f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9\": rpc error: code = NotFound desc = could not find container \"f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9\": container with ID starting with f4c25b20a93f56eaf38417c311d7067a6a70a0eae3c8d095ab704f09d440e1d9 not found: ID does not exist" Apr 23 18:48:44.317561 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.317487 2578 scope.go:117] "RemoveContainer" containerID="2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694" Apr 23 18:48:44.317797 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:48:44.317772 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694\": container with ID starting with 2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694 not found: ID does not exist" containerID="2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694" Apr 23 18:48:44.317856 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.317807 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694"} err="failed to get container status \"2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694\": rpc error: code = NotFound desc = could not find container 
\"2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694\": container with ID starting with 2a12ae8f02116d26910093687f0b059a1714645aca397cf5e37af957c8832694 not found: ID does not exist" Apr 23 18:48:44.317856 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.317832 2578 scope.go:117] "RemoveContainer" containerID="0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06" Apr 23 18:48:44.318105 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:48:44.318087 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06\": container with ID starting with 0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06 not found: ID does not exist" containerID="0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06" Apr 23 18:48:44.318169 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.318112 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06"} err="failed to get container status \"0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06\": rpc error: code = NotFound desc = could not find container \"0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06\": container with ID starting with 0178bfbffcb2b11223fa30dcd8dbb4d8ec4a9fb958765e987a391c22e0ebfe06 not found: ID does not exist" Apr 23 18:48:44.318694 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.318671 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zkcng"] Apr 23 18:48:44.357306 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:44.357272 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" path="/var/lib/kubelet/pods/74bd9057-45b9-4ccb-9449-e286b47d1437/volumes" Apr 23 18:48:54.115437 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:48:54.115398 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"] Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115752 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115763 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115779 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="storage-initializer" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115786 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="storage-initializer" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115796 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115803 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115856 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kserve-container" Apr 23 18:48:54.115876 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.115864 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="74bd9057-45b9-4ccb-9449-e286b47d1437" containerName="kube-rbac-proxy" Apr 23 18:48:54.120589 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:48:54.120560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.122508 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.122480 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 23 18:48:54.122954 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.122934 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:48:54.123151 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.123122 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:48:54.123215 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.123175 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 23 18:48:54.123215 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.123176 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:48:54.127111 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.127059 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"] Apr 23 18:48:54.212805 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.212764 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.213034 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:48:54.212816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62bf\" (UniqueName: \"kubernetes.io/projected/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kube-api-access-q62bf\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.213034 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.212895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.213034 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.212938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.313721 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.313678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.313863 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.313733 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.313863 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.313782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.313863 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.313832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q62bf\" (UniqueName: \"kubernetes.io/projected/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kube-api-access-q62bf\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.313987 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:48:54.313932 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 23 18:48:54.314034 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:48:54.313994 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls podName:5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2 nodeName:}" failed. No retries permitted until 2026-04-23 18:48:54.813978699 +0000 UTC m=+3019.047720595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-96znp" (UID: "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2") : secret "isvc-triton-predictor-serving-cert" not found Apr 23 18:48:54.314226 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.314208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.314450 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.314431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.324161 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.324132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62bf\" (UniqueName: \"kubernetes.io/projected/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kube-api-access-q62bf\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.819060 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.819003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: 
\"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:54.821498 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:54.821461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-96znp\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:55.033132 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:55.033090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:48:55.162458 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:55.162420 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"] Apr 23 18:48:55.165396 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:48:55.165355 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfcca61_c2aa_4347_83a0_eb1bc98aa3b2.slice/crio-d5cac0f408bbba5609e6ddfcbe598eb91f1a61b9c1e8f9e43a3cf28d14f8b573 WatchSource:0}: Error finding container d5cac0f408bbba5609e6ddfcbe598eb91f1a61b9c1e8f9e43a3cf28d14f8b573: Status 404 returned error can't find the container with id d5cac0f408bbba5609e6ddfcbe598eb91f1a61b9c1e8f9e43a3cf28d14f8b573 Apr 23 18:48:55.332003 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:55.331900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerStarted","Data":"d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f"} Apr 23 18:48:55.332003 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:55.331952 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerStarted","Data":"d5cac0f408bbba5609e6ddfcbe598eb91f1a61b9c1e8f9e43a3cf28d14f8b573"} Apr 23 18:48:59.347131 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:59.347091 2578 generic.go:358] "Generic (PLEG): container finished" podID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerID="d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f" exitCode=0 Apr 23 18:48:59.347513 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:48:59.347167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerDied","Data":"d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f"} Apr 23 18:50:55.821483 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:55.821424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerStarted","Data":"6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759"} Apr 23 18:50:55.821483 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:55.821473 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerStarted","Data":"2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146"} Apr 23 18:50:55.822094 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:55.821557 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:50:55.854765 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:55.854657 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" 
podStartSLOduration=5.598559269 podStartE2EDuration="2m1.854640321s" podCreationTimestamp="2026-04-23 18:48:54 +0000 UTC" firstStartedPulling="2026-04-23 18:48:59.348344361 +0000 UTC m=+3023.582086256" lastFinishedPulling="2026-04-23 18:50:55.604425413 +0000 UTC m=+3139.838167308" observedRunningTime="2026-04-23 18:50:55.852775162 +0000 UTC m=+3140.086517072" watchObservedRunningTime="2026-04-23 18:50:55.854640321 +0000 UTC m=+3140.088382236" Apr 23 18:50:56.825502 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:56.825465 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:50:56.826693 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:56.826667 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:50:57.829160 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:50:57.829115 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 18:51:02.833514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:02.833481 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:51:02.834341 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:02.834323 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:51:05.685016 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.684973 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"] Apr 23 18:51:05.685514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.685356 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kserve-container" containerID="cri-o://2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146" gracePeriod=30 Apr 23 18:51:05.685514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.685458 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kube-rbac-proxy" containerID="cri-o://6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759" gracePeriod=30 Apr 23 18:51:05.789935 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.789901 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"] Apr 23 18:51:05.813491 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.813456 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"] Apr 23 18:51:05.813695 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.813637 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:05.815776 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.815747 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 23 18:51:05.815776 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.815748 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 23 18:51:05.857754 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.857718 2578 generic.go:358] "Generic (PLEG): container finished" podID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerID="6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759" exitCode=2 Apr 23 18:51:05.857926 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.857793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerDied","Data":"6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759"} Apr 23 18:51:05.916382 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.916348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b922f0-bee6-4b62-8b88-906af09d135e-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:05.916573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.916409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggkj7\" (UniqueName: \"kubernetes.io/projected/30b922f0-bee6-4b62-8b88-906af09d135e-kube-api-access-ggkj7\") pod 
\"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:05.916573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.916509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:05.916679 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:05.916611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30b922f0-bee6-4b62-8b88-906af09d135e-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.017832 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.017790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.018038 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.017849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30b922f0-bee6-4b62-8b88-906af09d135e-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.018038 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.017883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b922f0-bee6-4b62-8b88-906af09d135e-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.018038 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.017928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggkj7\" (UniqueName: \"kubernetes.io/projected/30b922f0-bee6-4b62-8b88-906af09d135e-kube-api-access-ggkj7\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.018038 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:51:06.017971 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found Apr 23 18:51:06.018250 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:51:06.018061 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls podName:30b922f0-bee6-4b62-8b88-906af09d135e nodeName:}" failed. No retries permitted until 2026-04-23 18:51:06.518039232 +0000 UTC m=+3150.751781126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-rwfk5" (UID: "30b922f0-bee6-4b62-8b88-906af09d135e") : secret "isvc-xgboost-predictor-serving-cert" not found Apr 23 18:51:06.018307 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.018286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b922f0-bee6-4b62-8b88-906af09d135e-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.018643 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.018625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30b922f0-bee6-4b62-8b88-906af09d135e-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.027129 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.027093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggkj7\" (UniqueName: \"kubernetes.io/projected/30b922f0-bee6-4b62-8b88-906af09d135e-kube-api-access-ggkj7\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.521978 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.521938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: 
\"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.524595 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.524566 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-rwfk5\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.726281 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.726238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" Apr 23 18:51:06.855761 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:06.855728 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"] Apr 23 18:51:06.858092 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:51:06.858059 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30b922f0_bee6_4b62_8b88_906af09d135e.slice/crio-eddb02ac350f26c455b2e75a8294e74b485510191d0d89aceaa27df2017d94c3 WatchSource:0}: Error finding container eddb02ac350f26c455b2e75a8294e74b485510191d0d89aceaa27df2017d94c3: Status 404 returned error can't find the container with id eddb02ac350f26c455b2e75a8294e74b485510191d0d89aceaa27df2017d94c3 Apr 23 18:51:07.829559 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:07.829494 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 23 18:51:07.866157 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:07.866117 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerStarted","Data":"98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd"} Apr 23 18:51:07.866157 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:07.866157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerStarted","Data":"eddb02ac350f26c455b2e75a8294e74b485510191d0d89aceaa27df2017d94c3"} Apr 23 18:51:08.532703 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.532679 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" Apr 23 18:51:08.539182 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.539157 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls\") pod \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " Apr 23 18:51:08.539335 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.539219 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q62bf\" (UniqueName: \"kubernetes.io/projected/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kube-api-access-q62bf\") pod \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " Apr 23 18:51:08.539335 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.539247 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kserve-provision-location\") pod \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " Apr 23 
18:51:08.539335 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.539275 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-isvc-triton-kube-rbac-proxy-sar-config\") pod \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\" (UID: \"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2\") " Apr 23 18:51:08.539686 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.539662 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" (UID: "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:51:08.539796 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.539688 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" (UID: "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:51:08.541253 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.541233 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" (UID: "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:51:08.541340 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.541251 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kube-api-access-q62bf" (OuterVolumeSpecName: "kube-api-access-q62bf") pod "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" (UID: "5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2"). InnerVolumeSpecName "kube-api-access-q62bf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:51:08.640027 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.639943 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q62bf\" (UniqueName: \"kubernetes.io/projected/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kube-api-access-q62bf\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:51:08.640027 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.639973 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:51:08.640027 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.639985 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:51:08.640027 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.639995 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:51:08.871762 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.871723 2578 generic.go:358] "Generic (PLEG): container finished" podID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerID="2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146" exitCode=0
Apr 23 18:51:08.872346 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.871812 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"
Apr 23 18:51:08.872346 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.871842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerDied","Data":"2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146"}
Apr 23 18:51:08.872346 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.871877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp" event={"ID":"5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2","Type":"ContainerDied","Data":"d5cac0f408bbba5609e6ddfcbe598eb91f1a61b9c1e8f9e43a3cf28d14f8b573"}
Apr 23 18:51:08.872346 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.871892 2578 scope.go:117] "RemoveContainer" containerID="6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759"
Apr 23 18:51:08.880591 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.880569 2578 scope.go:117] "RemoveContainer" containerID="2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146"
Apr 23 18:51:08.888772 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.888712 2578 scope.go:117] "RemoveContainer" containerID="d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f"
Apr 23 18:51:08.893659 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.893592 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"]
Apr 23 18:51:08.896650 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.896516 2578 scope.go:117] "RemoveContainer" containerID="6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759"
Apr 23 18:51:08.896912 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:51:08.896875 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759\": container with ID starting with 6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759 not found: ID does not exist" containerID="6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759"
Apr 23 18:51:08.896991 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.896926 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759"} err="failed to get container status \"6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759\": rpc error: code = NotFound desc = could not find container \"6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759\": container with ID starting with 6699ec5b07aa31cdc2f09eb26669c484140f39949a091076c467abd092ee3759 not found: ID does not exist"
Apr 23 18:51:08.896991 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.896954 2578 scope.go:117] "RemoveContainer" containerID="2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146"
Apr 23 18:51:08.897286 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:51:08.897263 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146\": container with ID starting with 2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146 not found: ID does not exist" containerID="2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146"
Apr 23 18:51:08.897404 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.897290 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146"} err="failed to get container status \"2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146\": rpc error: code = NotFound desc = could not find container \"2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146\": container with ID starting with 2f8247d4743d7de67b4e7fcbcfcc728e2f646bdac68e47afdd8ff517a2ea3146 not found: ID does not exist"
Apr 23 18:51:08.897404 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.897313 2578 scope.go:117] "RemoveContainer" containerID="d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f"
Apr 23 18:51:08.897701 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:51:08.897668 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f\": container with ID starting with d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f not found: ID does not exist" containerID="d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f"
Apr 23 18:51:08.897785 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.897709 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f"} err="failed to get container status \"d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f\": rpc error: code = NotFound desc = could not find container \"d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f\": container with ID starting with d14a5062af44f589c6014634251bd4ab97e345e43d317357ae64487211184b9f not found: ID does not exist"
Apr 23 18:51:08.898306 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:08.898284 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-96znp"]
Apr 23 18:51:10.357982 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:10.357943 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" path="/var/lib/kubelet/pods/5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2/volumes"
Apr 23 18:51:10.886395 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:10.886359 2578 generic.go:358] "Generic (PLEG): container finished" podID="30b922f0-bee6-4b62-8b88-906af09d135e" containerID="98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd" exitCode=0
Apr 23 18:51:10.886617 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:10.886419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerDied","Data":"98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd"}
Apr 23 18:51:30.957956 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:30.957919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerStarted","Data":"2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7"}
Apr 23 18:51:30.957956 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:30.957964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerStarted","Data":"8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9"}
Apr 23 18:51:30.958421 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:30.958174 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"
Apr 23 18:51:30.976283 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:30.976228 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podStartSLOduration=6.491323277 podStartE2EDuration="25.976210781s" podCreationTimestamp="2026-04-23 18:51:05 +0000 UTC" firstStartedPulling="2026-04-23 18:51:10.88759873 +0000 UTC m=+3155.121340624" lastFinishedPulling="2026-04-23 18:51:30.372486221 +0000 UTC m=+3174.606228128" observedRunningTime="2026-04-23 18:51:30.975053725 +0000 UTC m=+3175.208795641" watchObservedRunningTime="2026-04-23 18:51:30.976210781 +0000 UTC m=+3175.209952772"
Apr 23 18:51:31.961783 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:31.961737 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"
Apr 23 18:51:31.962971 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:31.962949 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:51:32.964722 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:32.964682 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:51:37.969442 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:37.969408 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"
Apr 23 18:51:37.969979 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:37.969953 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:51:47.969915 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:47.969874 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:51:57.970631 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:51:57.970587 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:52:07.969959 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:07.969915 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:52:17.970512 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:17.970471 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:52:27.970312 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:27.970218 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:52:37.971096 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:37.971059 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"
Apr 23 18:52:45.928900 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:45.928861 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"]
Apr 23 18:52:45.929466 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:45.929271 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" containerID="cri-o://8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9" gracePeriod=30
Apr 23 18:52:45.929466 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:45.929430 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kube-rbac-proxy" containerID="cri-o://2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7" gracePeriod=30
Apr 23 18:52:46.207924 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:46.207832 2578 generic.go:358] "Generic (PLEG): container finished" podID="30b922f0-bee6-4b62-8b88-906af09d135e" containerID="2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7" exitCode=2
Apr 23 18:52:46.207924 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:46.207902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerDied","Data":"2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7"}
Apr 23 18:52:47.965746 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:47.965701 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.45:8643/healthz\": dial tcp 10.133.0.45:8643: connect: connection refused"
Apr 23 18:52:47.970065 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:47.970030 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 23 18:52:49.669776 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.669753 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"
Apr 23 18:52:49.805708 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.805611 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggkj7\" (UniqueName: \"kubernetes.io/projected/30b922f0-bee6-4b62-8b88-906af09d135e-kube-api-access-ggkj7\") pod \"30b922f0-bee6-4b62-8b88-906af09d135e\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") "
Apr 23 18:52:49.805708 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.805654 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b922f0-bee6-4b62-8b88-906af09d135e-kserve-provision-location\") pod \"30b922f0-bee6-4b62-8b88-906af09d135e\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") "
Apr 23 18:52:49.805949 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.805717 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls\") pod \"30b922f0-bee6-4b62-8b88-906af09d135e\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") "
Apr 23 18:52:49.805949 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.805745 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30b922f0-bee6-4b62-8b88-906af09d135e-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"30b922f0-bee6-4b62-8b88-906af09d135e\" (UID: \"30b922f0-bee6-4b62-8b88-906af09d135e\") "
Apr 23 18:52:49.806096 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.806062 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b922f0-bee6-4b62-8b88-906af09d135e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30b922f0-bee6-4b62-8b88-906af09d135e" (UID: "30b922f0-bee6-4b62-8b88-906af09d135e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:52:49.806151 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.806125 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b922f0-bee6-4b62-8b88-906af09d135e-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "30b922f0-bee6-4b62-8b88-906af09d135e" (UID: "30b922f0-bee6-4b62-8b88-906af09d135e"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:52:49.807877 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.807849 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "30b922f0-bee6-4b62-8b88-906af09d135e" (UID: "30b922f0-bee6-4b62-8b88-906af09d135e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:52:49.807993 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.807917 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b922f0-bee6-4b62-8b88-906af09d135e-kube-api-access-ggkj7" (OuterVolumeSpecName: "kube-api-access-ggkj7") pod "30b922f0-bee6-4b62-8b88-906af09d135e" (UID: "30b922f0-bee6-4b62-8b88-906af09d135e"). InnerVolumeSpecName "kube-api-access-ggkj7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:52:49.907206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.907173 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggkj7\" (UniqueName: \"kubernetes.io/projected/30b922f0-bee6-4b62-8b88-906af09d135e-kube-api-access-ggkj7\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:52:49.907206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.907202 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b922f0-bee6-4b62-8b88-906af09d135e-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:52:49.907206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.907212 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30b922f0-bee6-4b62-8b88-906af09d135e-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:52:49.907451 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:49.907223 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30b922f0-bee6-4b62-8b88-906af09d135e-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 18:52:50.223748 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.223654 2578 generic.go:358] "Generic (PLEG): container finished" podID="30b922f0-bee6-4b62-8b88-906af09d135e" containerID="8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9" exitCode=0
Apr 23 18:52:50.223748 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.223719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerDied","Data":"8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9"}
Apr 23 18:52:50.223748 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.223731 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"
Apr 23 18:52:50.224014 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.223754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5" event={"ID":"30b922f0-bee6-4b62-8b88-906af09d135e","Type":"ContainerDied","Data":"eddb02ac350f26c455b2e75a8294e74b485510191d0d89aceaa27df2017d94c3"}
Apr 23 18:52:50.224014 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.223775 2578 scope.go:117] "RemoveContainer" containerID="2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7"
Apr 23 18:52:50.232786 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.232769 2578 scope.go:117] "RemoveContainer" containerID="8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9"
Apr 23 18:52:50.240197 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.240177 2578 scope.go:117] "RemoveContainer" containerID="98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd"
Apr 23 18:52:50.245771 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.245740 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"]
Apr 23 18:52:50.247921 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.247902 2578 scope.go:117] "RemoveContainer" containerID="2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7"
Apr 23 18:52:50.248160 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:52:50.248142 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7\": container with ID starting with 2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7 not found: ID does not exist" containerID="2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7"
Apr 23 18:52:50.248206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.248172 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7"} err="failed to get container status \"2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7\": rpc error: code = NotFound desc = could not find container \"2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7\": container with ID starting with 2bf956d6ebb2d7a3dd3b0ca77a9be0b25f818185cc495f4660f3b13d81d8bed7 not found: ID does not exist"
Apr 23 18:52:50.248206 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.248190 2578 scope.go:117] "RemoveContainer" containerID="8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9"
Apr 23 18:52:50.248430 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:52:50.248407 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9\": container with ID starting with 8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9 not found: ID does not exist" containerID="8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9"
Apr 23 18:52:50.248430 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.248435 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9"} err="failed to get container status \"8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9\": rpc error: code = NotFound desc = could not find container \"8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9\": container with ID starting with 8b9a9683e52a4b4fdd38e56beee2b67b82f9b572fabd3f9ac9676fc328b485e9 not found: ID does not exist"
Apr 23 18:52:50.248748 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.248452 2578 scope.go:117] "RemoveContainer" containerID="98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd"
Apr 23 18:52:50.248848 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:52:50.248782 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd\": container with ID starting with 98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd not found: ID does not exist" containerID="98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd"
Apr 23 18:52:50.248848 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.248811 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd"} err="failed to get container status \"98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd\": rpc error: code = NotFound desc = could not find container \"98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd\": container with ID starting with 98b183b27b8192dd1f7c46a69a6b41e5b4f148c08be61e2cb17934c4ab839cbd not found: ID does not exist"
Apr 23 18:52:50.251506 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.251485 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-rwfk5"]
Apr 23 18:52:50.358051 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:52:50.358016 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" path="/var/lib/kubelet/pods/30b922f0-bee6-4b62-8b88-906af09d135e/volumes"
Apr 23 18:53:36.459800 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:53:36.459767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 18:53:36.466771 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:53:36.466745 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 18:53:36.468832 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:53:36.468810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 18:53:36.474897 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:53:36.474864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 18:54:26.325515 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325477 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"]
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325812 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kube-rbac-proxy"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325824 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kube-rbac-proxy"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325833 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="storage-initializer"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325839 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="storage-initializer"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325856 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325861 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325867 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kube-rbac-proxy"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325872 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kube-rbac-proxy"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325880 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kserve-container"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325885 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kserve-container"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325894 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="storage-initializer"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325900 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="storage-initializer"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325949 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kube-rbac-proxy"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325958 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kube-rbac-proxy"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325967 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b922f0-bee6-4b62-8b88-906af09d135e" containerName="kserve-container"
Apr 23 18:54:26.326011 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.325974 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cfcca61-c2aa-4347-83a0-eb1bc98aa3b2" containerName="kserve-container"
Apr 23 18:54:26.328878 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.328861 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.331628 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.331592 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:54:26.331784 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.331594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:54:26.331784 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.331601 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\""
Apr 23 18:54:26.331784 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.331628 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\""
Apr 23 18:54:26.332638 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.332611 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:54:26.342694 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.342653 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"]
Apr 23 18:54:26.418008 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.417970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3631085-bc9d-4164-b65d-58e8cfd81ed6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.418008 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.418010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3631085-bc9d-4164-b65d-58e8cfd81ed6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.418247 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.418043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.418247 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.418178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b84r5\" (UniqueName: \"kubernetes.io/projected/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kube-api-access-b84r5\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.519410 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.519350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b84r5\" (UniqueName: \"kubernetes.io/projected/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kube-api-access-b84r5\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.519654 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.519446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3631085-bc9d-4164-b65d-58e8cfd81ed6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.519654 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.519471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3631085-bc9d-4164-b65d-58e8cfd81ed6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.519654 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.519490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.520072 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.520049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.520254 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.520229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3631085-bc9d-4164-b65d-58e8cfd81ed6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.522027 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.522007 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3631085-bc9d-4164-b65d-58e8cfd81ed6-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.527407 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.527386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b84r5\" (UniqueName: \"kubernetes.io/projected/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kube-api-access-b84r5\") pod \"isvc-xgboost-runtime-predictor-779db84d9-qk7k6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"
Apr 23 18:54:26.644346 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.644249 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:54:26.766476 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.766253 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"] Apr 23 18:54:26.769393 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:54:26.769360 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3631085_bc9d_4164_b65d_58e8cfd81ed6.slice/crio-41f43c99d910954218ad7cccbd8547c4c1a0aa6638967264353fd670b8231b90 WatchSource:0}: Error finding container 41f43c99d910954218ad7cccbd8547c4c1a0aa6638967264353fd670b8231b90: Status 404 returned error can't find the container with id 41f43c99d910954218ad7cccbd8547c4c1a0aa6638967264353fd670b8231b90 Apr 23 18:54:26.771261 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:26.771245 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:54:27.551771 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:27.551731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerStarted","Data":"13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636"} Apr 23 18:54:27.551771 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:27.551776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerStarted","Data":"41f43c99d910954218ad7cccbd8547c4c1a0aa6638967264353fd670b8231b90"} Apr 23 18:54:31.566272 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:31.566235 2578 generic.go:358] "Generic (PLEG): container finished" podID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" 
containerID="13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636" exitCode=0 Apr 23 18:54:31.566794 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:31.566303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerDied","Data":"13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636"} Apr 23 18:54:32.571598 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:32.571561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerStarted","Data":"c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0"} Apr 23 18:54:32.571598 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:32.571603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerStarted","Data":"ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c"} Apr 23 18:54:32.572123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:32.571918 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:54:32.572123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:32.572047 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:54:32.573595 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:32.573569 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection 
refused" Apr 23 18:54:32.591523 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:32.591471 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podStartSLOduration=6.5914554370000005 podStartE2EDuration="6.591455437s" podCreationTimestamp="2026-04-23 18:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:54:32.58910978 +0000 UTC m=+3356.822851696" watchObservedRunningTime="2026-04-23 18:54:32.591455437 +0000 UTC m=+3356.825197353" Apr 23 18:54:33.575301 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:33.575265 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:54:38.579645 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:38.579609 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:54:38.580211 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:38.580184 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:54:48.580209 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:48.580163 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 
18:54:58.580658 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:54:58.580616 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:55:08.581160 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:08.581122 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:55:18.580214 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:18.580172 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:55:28.580442 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:28.580355 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:55:38.580695 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:38.580661 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:55:46.443313 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:46.443264 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"] Apr 23 18:55:46.443868 ip-10-0-143-63 kubenswrapper[2578]: I0423 
18:55:46.443602 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" containerID="cri-o://ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c" gracePeriod=30 Apr 23 18:55:46.443868 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:46.443674 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kube-rbac-proxy" containerID="cri-o://c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0" gracePeriod=30 Apr 23 18:55:46.819630 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:46.819592 2578 generic.go:358] "Generic (PLEG): container finished" podID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerID="c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0" exitCode=2 Apr 23 18:55:46.819818 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:46.819661 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerDied","Data":"c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0"} Apr 23 18:55:48.576306 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:48.576260 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 23 18:55:48.580731 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:48.580695 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 18:55:50.280493 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.280470 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:55:50.350785 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.350740 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3631085-bc9d-4164-b65d-58e8cfd81ed6-proxy-tls\") pod \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " Apr 23 18:55:50.350785 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.350782 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b84r5\" (UniqueName: \"kubernetes.io/projected/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kube-api-access-b84r5\") pod \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " Apr 23 18:55:50.351063 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.350894 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3631085-bc9d-4164-b65d-58e8cfd81ed6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\" (UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " Apr 23 18:55:50.351063 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.350961 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kserve-provision-location\") pod \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\" 
(UID: \"a3631085-bc9d-4164-b65d-58e8cfd81ed6\") " Apr 23 18:55:50.351269 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.351243 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3631085-bc9d-4164-b65d-58e8cfd81ed6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "a3631085-bc9d-4164-b65d-58e8cfd81ed6" (UID: "a3631085-bc9d-4164-b65d-58e8cfd81ed6"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:55:50.351329 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.351264 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3631085-bc9d-4164-b65d-58e8cfd81ed6" (UID: "a3631085-bc9d-4164-b65d-58e8cfd81ed6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:55:50.352955 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.352919 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3631085-bc9d-4164-b65d-58e8cfd81ed6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a3631085-bc9d-4164-b65d-58e8cfd81ed6" (UID: "a3631085-bc9d-4164-b65d-58e8cfd81ed6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:55:50.353062 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.353019 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kube-api-access-b84r5" (OuterVolumeSpecName: "kube-api-access-b84r5") pod "a3631085-bc9d-4164-b65d-58e8cfd81ed6" (UID: "a3631085-bc9d-4164-b65d-58e8cfd81ed6"). InnerVolumeSpecName "kube-api-access-b84r5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:55:50.452123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.452083 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3631085-bc9d-4164-b65d-58e8cfd81ed6-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:55:50.452123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.452116 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:55:50.452123 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.452128 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3631085-bc9d-4164-b65d-58e8cfd81ed6-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:55:50.452350 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.452139 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b84r5\" (UniqueName: \"kubernetes.io/projected/a3631085-bc9d-4164-b65d-58e8cfd81ed6-kube-api-access-b84r5\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:55:50.834363 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.834332 2578 generic.go:358] "Generic (PLEG): container finished" podID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerID="ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c" exitCode=0 Apr 23 18:55:50.834573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.834433 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" Apr 23 18:55:50.834573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.834429 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerDied","Data":"ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c"} Apr 23 18:55:50.834573 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.834565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6" event={"ID":"a3631085-bc9d-4164-b65d-58e8cfd81ed6","Type":"ContainerDied","Data":"41f43c99d910954218ad7cccbd8547c4c1a0aa6638967264353fd670b8231b90"} Apr 23 18:55:50.834760 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.834582 2578 scope.go:117] "RemoveContainer" containerID="c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0" Apr 23 18:55:50.844684 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.844633 2578 scope.go:117] "RemoveContainer" containerID="ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c" Apr 23 18:55:50.852205 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.852174 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"] Apr 23 18:55:50.853704 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.853608 2578 scope.go:117] "RemoveContainer" containerID="13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636" Apr 23 18:55:50.855946 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.855926 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-qk7k6"] Apr 23 18:55:50.861679 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.861659 2578 scope.go:117] "RemoveContainer" 
containerID="c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0" Apr 23 18:55:50.861979 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:55:50.861957 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0\": container with ID starting with c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0 not found: ID does not exist" containerID="c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0" Apr 23 18:55:50.862073 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.861987 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0"} err="failed to get container status \"c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0\": rpc error: code = NotFound desc = could not find container \"c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0\": container with ID starting with c03526563585179c851f6c7e50e932759c5eacb29d4773b69824434a767471a0 not found: ID does not exist" Apr 23 18:55:50.862073 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.862010 2578 scope.go:117] "RemoveContainer" containerID="ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c" Apr 23 18:55:50.862276 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:55:50.862257 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c\": container with ID starting with ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c not found: ID does not exist" containerID="ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c" Apr 23 18:55:50.862323 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.862284 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c"} err="failed to get container status \"ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c\": rpc error: code = NotFound desc = could not find container \"ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c\": container with ID starting with ab0282cbec5118aca392492a2a7b508e57add7b3c4ef2ad4c28d23c854ae774c not found: ID does not exist" Apr 23 18:55:50.862323 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.862301 2578 scope.go:117] "RemoveContainer" containerID="13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636" Apr 23 18:55:50.862564 ip-10-0-143-63 kubenswrapper[2578]: E0423 18:55:50.862524 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636\": container with ID starting with 13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636 not found: ID does not exist" containerID="13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636" Apr 23 18:55:50.862564 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:50.862561 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636"} err="failed to get container status \"13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636\": rpc error: code = NotFound desc = could not find container \"13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636\": container with ID starting with 13be118b91e5a9abda3a7c4e5eb7df600d6ede96707116e89dd482437e9bd636 not found: ID does not exist" Apr 23 18:55:52.357446 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:55:52.357414 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" 
path="/var/lib/kubelet/pods/a3631085-bc9d-4164-b65d-58e8cfd81ed6/volumes" Apr 23 18:56:46.712865 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.712829 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62"] Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713152 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="storage-initializer" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713163 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="storage-initializer" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713176 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713182 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713189 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kube-rbac-proxy" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713194 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kube-rbac-proxy" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713253 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kube-rbac-proxy" Apr 23 18:56:46.713318 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.713261 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a3631085-bc9d-4164-b65d-58e8cfd81ed6" containerName="kserve-container" Apr 23 18:56:46.716437 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.716415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.718383 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.718361 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 23 18:56:46.718570 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.718399 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:56:46.718570 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.718417 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:56:46.719308 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.719289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:56:46.719418 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.719324 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:56:46.725822 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.725797 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62"] Apr 23 18:56:46.815000 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.814964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd399a8-e13f-4783-ae77-321150364765-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: 
\"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.815224 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.815032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdd399a8-e13f-4783-ae77-321150364765-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.815224 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.815073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9zxx\" (UniqueName: \"kubernetes.io/projected/fdd399a8-e13f-4783-ae77-321150364765-kube-api-access-r9zxx\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.815224 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.815142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdd399a8-e13f-4783-ae77-321150364765-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.916710 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.916664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd399a8-e13f-4783-ae77-321150364765-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.916890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.916735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdd399a8-e13f-4783-ae77-321150364765-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.916890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.916777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9zxx\" (UniqueName: \"kubernetes.io/projected/fdd399a8-e13f-4783-ae77-321150364765-kube-api-access-r9zxx\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.916890 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.916831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdd399a8-e13f-4783-ae77-321150364765-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.917157 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.917125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd399a8-e13f-4783-ae77-321150364765-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.917606 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.917582 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdd399a8-e13f-4783-ae77-321150364765-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.919312 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.919287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdd399a8-e13f-4783-ae77-321150364765-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:46.924359 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:46.924331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9zxx\" (UniqueName: \"kubernetes.io/projected/fdd399a8-e13f-4783-ae77-321150364765-kube-api-access-r9zxx\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-46k62\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:47.027514 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:47.027487 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:47.154335 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:47.154302 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62"] Apr 23 18:56:47.157465 ip-10-0-143-63 kubenswrapper[2578]: W0423 18:56:47.157435 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd399a8_e13f_4783_ae77_321150364765.slice/crio-7c81421923de0a04c888229f4962801832fac576ec593a21f2d3b9a0c297d757 WatchSource:0}: Error finding container 7c81421923de0a04c888229f4962801832fac576ec593a21f2d3b9a0c297d757: Status 404 returned error can't find the container with id 7c81421923de0a04c888229f4962801832fac576ec593a21f2d3b9a0c297d757 Apr 23 18:56:48.023665 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:48.023621 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerStarted","Data":"404b764cb438d3a20ea36b718822dcf40c4f337a2861754a35cb1af0e96193a2"} Apr 23 18:56:48.023665 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:48.023667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerStarted","Data":"7c81421923de0a04c888229f4962801832fac576ec593a21f2d3b9a0c297d757"} Apr 23 18:56:51.034034 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:51.033995 2578 generic.go:358] "Generic (PLEG): container finished" podID="fdd399a8-e13f-4783-ae77-321150364765" containerID="404b764cb438d3a20ea36b718822dcf40c4f337a2861754a35cb1af0e96193a2" exitCode=0 Apr 23 18:56:51.034446 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:51.034069 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerDied","Data":"404b764cb438d3a20ea36b718822dcf40c4f337a2861754a35cb1af0e96193a2"} Apr 23 18:56:52.038937 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:52.038901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerStarted","Data":"9f5ca1311b50265a66ab30f0c699eda342d69a359aedad034753fe58e1c8999b"} Apr 23 18:56:52.038937 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:52.038942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerStarted","Data":"b945be6882949885af4ee26b52351a1405f7c3ede9b5418cf517a8064541a433"} Apr 23 18:56:52.039422 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:52.039240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:52.039422 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:52.039353 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:52.040440 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:52.040418 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:56:52.058495 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:52.058445 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" 
podStartSLOduration=6.058429865 podStartE2EDuration="6.058429865s" podCreationTimestamp="2026-04-23 18:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:56:52.057294999 +0000 UTC m=+3496.291036917" watchObservedRunningTime="2026-04-23 18:56:52.058429865 +0000 UTC m=+3496.292171781" Apr 23 18:56:53.042719 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:53.042672 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:56:58.048183 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:58.048107 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:56:58.048704 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:56:58.048676 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:57:08.049179 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:57:08.049130 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:57:18.048799 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:57:18.048760 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:57:28.049686 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:57:28.049634 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:57:38.049064 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:57:38.049023 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:57:48.049395 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:57:48.049350 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:57:58.049697 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:57:58.049667 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:58:06.823862 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:06.823826 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62"] Apr 23 18:58:06.824469 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:06.824183 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" 
containerID="cri-o://b945be6882949885af4ee26b52351a1405f7c3ede9b5418cf517a8064541a433" gracePeriod=30 Apr 23 18:58:06.824469 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:06.824207 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kube-rbac-proxy" containerID="cri-o://9f5ca1311b50265a66ab30f0c699eda342d69a359aedad034753fe58e1c8999b" gracePeriod=30 Apr 23 18:58:07.297094 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:07.297058 2578 generic.go:358] "Generic (PLEG): container finished" podID="fdd399a8-e13f-4783-ae77-321150364765" containerID="9f5ca1311b50265a66ab30f0c699eda342d69a359aedad034753fe58e1c8999b" exitCode=2 Apr 23 18:58:07.297280 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:07.297119 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerDied","Data":"9f5ca1311b50265a66ab30f0c699eda342d69a359aedad034753fe58e1c8999b"} Apr 23 18:58:08.043707 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:08.043655 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 23 18:58:08.049179 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:08.049148 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 23 18:58:10.310201 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.310074 
2578 generic.go:358] "Generic (PLEG): container finished" podID="fdd399a8-e13f-4783-ae77-321150364765" containerID="b945be6882949885af4ee26b52351a1405f7c3ede9b5418cf517a8064541a433" exitCode=0 Apr 23 18:58:10.310201 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.310153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerDied","Data":"b945be6882949885af4ee26b52351a1405f7c3ede9b5418cf517a8064541a433"} Apr 23 18:58:10.371914 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.371892 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:58:10.401742 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.401710 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd399a8-e13f-4783-ae77-321150364765-kserve-provision-location\") pod \"fdd399a8-e13f-4783-ae77-321150364765\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " Apr 23 18:58:10.401887 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.401765 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdd399a8-e13f-4783-ae77-321150364765-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"fdd399a8-e13f-4783-ae77-321150364765\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " Apr 23 18:58:10.401887 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.401829 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdd399a8-e13f-4783-ae77-321150364765-proxy-tls\") pod \"fdd399a8-e13f-4783-ae77-321150364765\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " Apr 23 18:58:10.401887 ip-10-0-143-63 
kubenswrapper[2578]: I0423 18:58:10.401852 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9zxx\" (UniqueName: \"kubernetes.io/projected/fdd399a8-e13f-4783-ae77-321150364765-kube-api-access-r9zxx\") pod \"fdd399a8-e13f-4783-ae77-321150364765\" (UID: \"fdd399a8-e13f-4783-ae77-321150364765\") " Apr 23 18:58:10.402095 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.402070 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd399a8-e13f-4783-ae77-321150364765-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fdd399a8-e13f-4783-ae77-321150364765" (UID: "fdd399a8-e13f-4783-ae77-321150364765"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:58:10.402154 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.402127 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd399a8-e13f-4783-ae77-321150364765-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "fdd399a8-e13f-4783-ae77-321150364765" (UID: "fdd399a8-e13f-4783-ae77-321150364765"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:58:10.403881 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.403857 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd399a8-e13f-4783-ae77-321150364765-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fdd399a8-e13f-4783-ae77-321150364765" (UID: "fdd399a8-e13f-4783-ae77-321150364765"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:58:10.403980 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.403919 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd399a8-e13f-4783-ae77-321150364765-kube-api-access-r9zxx" (OuterVolumeSpecName: "kube-api-access-r9zxx") pod "fdd399a8-e13f-4783-ae77-321150364765" (UID: "fdd399a8-e13f-4783-ae77-321150364765"). InnerVolumeSpecName "kube-api-access-r9zxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:58:10.503304 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.503278 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdd399a8-e13f-4783-ae77-321150364765-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:58:10.503304 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.503308 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdd399a8-e13f-4783-ae77-321150364765-proxy-tls\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:58:10.503500 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.503321 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r9zxx\" (UniqueName: \"kubernetes.io/projected/fdd399a8-e13f-4783-ae77-321150364765-kube-api-access-r9zxx\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:58:10.503500 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:10.503331 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd399a8-e13f-4783-ae77-321150364765-kserve-provision-location\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\"" Apr 23 18:58:11.314825 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.314798 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" Apr 23 18:58:11.314825 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.314804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62" event={"ID":"fdd399a8-e13f-4783-ae77-321150364765","Type":"ContainerDied","Data":"7c81421923de0a04c888229f4962801832fac576ec593a21f2d3b9a0c297d757"} Apr 23 18:58:11.315272 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.314857 2578 scope.go:117] "RemoveContainer" containerID="9f5ca1311b50265a66ab30f0c699eda342d69a359aedad034753fe58e1c8999b" Apr 23 18:58:11.323451 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.323387 2578 scope.go:117] "RemoveContainer" containerID="b945be6882949885af4ee26b52351a1405f7c3ede9b5418cf517a8064541a433" Apr 23 18:58:11.331558 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.331524 2578 scope.go:117] "RemoveContainer" containerID="404b764cb438d3a20ea36b718822dcf40c4f337a2861754a35cb1af0e96193a2" Apr 23 18:58:11.339474 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.339425 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62"] Apr 23 18:58:11.344735 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:11.344713 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-46k62"] Apr 23 18:58:12.357305 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:12.357272 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd399a8-e13f-4783-ae77-321150364765" path="/var/lib/kubelet/pods/fdd399a8-e13f-4783-ae77-321150364765/volumes" Apr 23 18:58:36.481973 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:36.481923 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 
18:58:36.488687 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:36.488656 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 18:58:36.491641 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:36.491616 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 18:58:36.497606 ip-10-0-143-63 kubenswrapper[2578]: I0423 18:58:36.497582 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 19:03:36.504649 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:36.504618 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 19:03:36.511399 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:36.511375 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 19:03:36.514222 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:36.514202 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log" Apr 23 19:03:36.520076 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:36.520058 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log" Apr 23 19:03:54.536645 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.536602 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4sfj8/must-gather-npl86"] Apr 23 19:03:54.537292 
ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537075 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="storage-initializer" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537096 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="storage-initializer" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537109 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537119 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537149 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kube-rbac-proxy" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537157 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kube-rbac-proxy" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537243 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kserve-container" Apr 23 19:03:54.537292 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.537262 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdd399a8-e13f-4783-ae77-321150364765" containerName="kube-rbac-proxy" Apr 23 19:03:54.540514 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.540486 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.542410 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.542374 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4sfj8\"/\"openshift-service-ca.crt\"" Apr 23 19:03:54.542577 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.542475 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4sfj8\"/\"default-dockercfg-kddfp\"" Apr 23 19:03:54.542577 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.542551 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4sfj8\"/\"kube-root-ca.crt\"" Apr 23 19:03:54.547135 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.547099 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4sfj8/must-gather-npl86"] Apr 23 19:03:54.681724 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.681685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4537785-00f7-40bb-b4d1-06b1b0f4da28-must-gather-output\") pod \"must-gather-npl86\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") " pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.681908 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.681748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkd9p\" (UniqueName: \"kubernetes.io/projected/a4537785-00f7-40bb-b4d1-06b1b0f4da28-kube-api-access-rkd9p\") pod \"must-gather-npl86\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") " pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.782397 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.782357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkd9p\" (UniqueName: 
\"kubernetes.io/projected/a4537785-00f7-40bb-b4d1-06b1b0f4da28-kube-api-access-rkd9p\") pod \"must-gather-npl86\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") " pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.782632 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.782446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4537785-00f7-40bb-b4d1-06b1b0f4da28-must-gather-output\") pod \"must-gather-npl86\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") " pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.782835 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.782813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4537785-00f7-40bb-b4d1-06b1b0f4da28-must-gather-output\") pod \"must-gather-npl86\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") " pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.791715 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.791645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkd9p\" (UniqueName: \"kubernetes.io/projected/a4537785-00f7-40bb-b4d1-06b1b0f4da28-kube-api-access-rkd9p\") pod \"must-gather-npl86\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") " pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:54.875615 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:54.875575 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4sfj8/must-gather-npl86" Apr 23 19:03:55.002683 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:55.002645 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4sfj8/must-gather-npl86"] Apr 23 19:03:55.005685 ip-10-0-143-63 kubenswrapper[2578]: W0423 19:03:55.005652 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4537785_00f7_40bb_b4d1_06b1b0f4da28.slice/crio-d7baf3b8a99e8b143482faa1f108c1c5aeb9ba396627b729eb70992d456fbe4d WatchSource:0}: Error finding container d7baf3b8a99e8b143482faa1f108c1c5aeb9ba396627b729eb70992d456fbe4d: Status 404 returned error can't find the container with id d7baf3b8a99e8b143482faa1f108c1c5aeb9ba396627b729eb70992d456fbe4d Apr 23 19:03:55.007340 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:55.007324 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 19:03:55.453726 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:03:55.453689 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4sfj8/must-gather-npl86" event={"ID":"a4537785-00f7-40bb-b4d1-06b1b0f4da28","Type":"ContainerStarted","Data":"d7baf3b8a99e8b143482faa1f108c1c5aeb9ba396627b729eb70992d456fbe4d"} Apr 23 19:04:00.473596 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:00.473553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4sfj8/must-gather-npl86" event={"ID":"a4537785-00f7-40bb-b4d1-06b1b0f4da28","Type":"ContainerStarted","Data":"54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5"} Apr 23 19:04:00.474064 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:00.473604 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4sfj8/must-gather-npl86" 
event={"ID":"a4537785-00f7-40bb-b4d1-06b1b0f4da28","Type":"ContainerStarted","Data":"94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"}
Apr 23 19:04:00.490132 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:00.490079 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4sfj8/must-gather-npl86" podStartSLOduration=2.000948195 podStartE2EDuration="6.490061001s" podCreationTimestamp="2026-04-23 19:03:54 +0000 UTC" firstStartedPulling="2026-04-23 19:03:55.007476064 +0000 UTC m=+3919.241217958" lastFinishedPulling="2026-04-23 19:03:59.49658887 +0000 UTC m=+3923.730330764" observedRunningTime="2026-04-23 19:04:00.488327572 +0000 UTC m=+3924.722069484" watchObservedRunningTime="2026-04-23 19:04:00.490061001 +0000 UTC m=+3924.723802916"
Apr 23 19:04:23.554720 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:23.554683 2578 generic.go:358] "Generic (PLEG): container finished" podID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerID="94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09" exitCode=0
Apr 23 19:04:23.555311 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:23.554766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4sfj8/must-gather-npl86" event={"ID":"a4537785-00f7-40bb-b4d1-06b1b0f4da28","Type":"ContainerDied","Data":"94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"}
Apr 23 19:04:23.555311 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:23.555197 2578 scope.go:117] "RemoveContainer" containerID="94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"
Apr 23 19:04:23.818642 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:23.818521 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4sfj8_must-gather-npl86_a4537785-00f7-40bb-b4d1-06b1b0f4da28/gather/0.log"
Apr 23 19:04:27.550396 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:27.550310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wm4q6_c32531ab-73c0-4407-990e-7be86cd675cc/global-pull-secret-syncer/0.log"
Apr 23 19:04:27.713386 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:27.713349 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qcp2q_94dfcde8-e6d8-4b6b-825e-40bb5305f5ef/konnectivity-agent/0.log"
Apr 23 19:04:27.788393 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:27.788356 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-63.ec2.internal_858e9dbb3f829b7de0afbb8a36ca323c/haproxy/0.log"
Apr 23 19:04:29.339482 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.339438 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4sfj8/must-gather-npl86"]
Apr 23 19:04:29.339938 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.339683 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-4sfj8/must-gather-npl86" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="copy" containerID="cri-o://54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5" gracePeriod=2
Apr 23 19:04:29.345818 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.345786 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4sfj8/must-gather-npl86"]
Apr 23 19:04:29.572603 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.572578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4sfj8_must-gather-npl86_a4537785-00f7-40bb-b4d1-06b1b0f4da28/copy/0.log"
Apr 23 19:04:29.572979 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.572959 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4sfj8/must-gather-npl86"
Apr 23 19:04:29.574750 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.574709 2578 status_manager.go:895] "Failed to get status for pod" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" pod="openshift-must-gather-4sfj8/must-gather-npl86" err="pods \"must-gather-npl86\" is forbidden: User \"system:node:ip-10-0-143-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4sfj8\": no relationship found between node 'ip-10-0-143-63.ec2.internal' and this object"
Apr 23 19:04:29.578711 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.578691 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4sfj8_must-gather-npl86_a4537785-00f7-40bb-b4d1-06b1b0f4da28/copy/0.log"
Apr 23 19:04:29.579008 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.578983 2578 generic.go:358] "Generic (PLEG): container finished" podID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerID="54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5" exitCode=143
Apr 23 19:04:29.579075 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.579026 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4sfj8/must-gather-npl86"
Apr 23 19:04:29.579131 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.579118 2578 scope.go:117] "RemoveContainer" containerID="54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5"
Apr 23 19:04:29.580571 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.580518 2578 status_manager.go:895] "Failed to get status for pod" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" pod="openshift-must-gather-4sfj8/must-gather-npl86" err="pods \"must-gather-npl86\" is forbidden: User \"system:node:ip-10-0-143-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4sfj8\": no relationship found between node 'ip-10-0-143-63.ec2.internal' and this object"
Apr 23 19:04:29.586506 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.586488 2578 scope.go:117] "RemoveContainer" containerID="94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"
Apr 23 19:04:29.600072 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.600047 2578 scope.go:117] "RemoveContainer" containerID="54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5"
Apr 23 19:04:29.600389 ip-10-0-143-63 kubenswrapper[2578]: E0423 19:04:29.600369 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5\": container with ID starting with 54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5 not found: ID does not exist" containerID="54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5"
Apr 23 19:04:29.600465 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.600398 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5"} err="failed to get container status \"54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5\": rpc error: code = NotFound desc = could not find container \"54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5\": container with ID starting with 54a34730aad2701a460ef3e440909c859644e78e2f16cb603722c8a0769938e5 not found: ID does not exist"
Apr 23 19:04:29.600465 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.600418 2578 scope.go:117] "RemoveContainer" containerID="94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"
Apr 23 19:04:29.600713 ip-10-0-143-63 kubenswrapper[2578]: E0423 19:04:29.600661 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09\": container with ID starting with 94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09 not found: ID does not exist" containerID="94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"
Apr 23 19:04:29.600713 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.600685 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09"} err="failed to get container status \"94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09\": rpc error: code = NotFound desc = could not find container \"94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09\": container with ID starting with 94d34ae2ff9b45932e34a5e2123250ecd726e1fc35b3340d7931f4efac18ac09 not found: ID does not exist"
Apr 23 19:04:29.686997 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.686960 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4537785-00f7-40bb-b4d1-06b1b0f4da28-must-gather-output\") pod \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") "
Apr 23 19:04:29.687149 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.687012 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkd9p\" (UniqueName: \"kubernetes.io/projected/a4537785-00f7-40bb-b4d1-06b1b0f4da28-kube-api-access-rkd9p\") pod \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\" (UID: \"a4537785-00f7-40bb-b4d1-06b1b0f4da28\") "
Apr 23 19:04:29.688604 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.688567 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4537785-00f7-40bb-b4d1-06b1b0f4da28-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a4537785-00f7-40bb-b4d1-06b1b0f4da28" (UID: "a4537785-00f7-40bb-b4d1-06b1b0f4da28"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:04:29.689234 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.689213 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4537785-00f7-40bb-b4d1-06b1b0f4da28-kube-api-access-rkd9p" (OuterVolumeSpecName: "kube-api-access-rkd9p") pod "a4537785-00f7-40bb-b4d1-06b1b0f4da28" (UID: "a4537785-00f7-40bb-b4d1-06b1b0f4da28"). InnerVolumeSpecName "kube-api-access-rkd9p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:04:29.787971 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.787930 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4537785-00f7-40bb-b4d1-06b1b0f4da28-must-gather-output\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 19:04:29.787971 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.787968 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkd9p\" (UniqueName: \"kubernetes.io/projected/a4537785-00f7-40bb-b4d1-06b1b0f4da28-kube-api-access-rkd9p\") on node \"ip-10-0-143-63.ec2.internal\" DevicePath \"\""
Apr 23 19:04:29.889161 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:29.889124 2578 status_manager.go:895] "Failed to get status for pod" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" pod="openshift-must-gather-4sfj8/must-gather-npl86" err="pods \"must-gather-npl86\" is forbidden: User \"system:node:ip-10-0-143-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4sfj8\": no relationship found between node 'ip-10-0-143-63.ec2.internal' and this object"
Apr 23 19:04:30.358170 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:30.358133 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" path="/var/lib/kubelet/pods/a4537785-00f7-40bb-b4d1-06b1b0f4da28/volumes"
Apr 23 19:04:31.422860 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.422824 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-77759d78bc-x8rbw_0d70a0dc-d43e-49a9-9f20-85985564bd98/metrics-server/0.log"
Apr 23 19:04:31.452337 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.452297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-f8bkz_15a23a51-53a4-43ca-950f-3449fae6160c/monitoring-plugin/0.log"
Apr 23 19:04:31.572158 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.572123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hcbrd_a4105c7e-1c6e-46a3-a884-5c701411dd9d/node-exporter/0.log"
Apr 23 19:04:31.591265 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.591230 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hcbrd_a4105c7e-1c6e-46a3-a884-5c701411dd9d/kube-rbac-proxy/0.log"
Apr 23 19:04:31.612595 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.612570 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hcbrd_a4105c7e-1c6e-46a3-a884-5c701411dd9d/init-textfile/0.log"
Apr 23 19:04:31.713242 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.713154 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b27pf_46d70cff-048a-4745-b44b-9b84f75b930e/kube-rbac-proxy-main/0.log"
Apr 23 19:04:31.735800 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.735771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b27pf_46d70cff-048a-4745-b44b-9b84f75b930e/kube-rbac-proxy-self/0.log"
Apr 23 19:04:31.760497 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.760463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b27pf_46d70cff-048a-4745-b44b-9b84f75b930e/openshift-state-metrics/0.log"
Apr 23 19:04:31.999249 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:31.999164 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gzbjc_b84d1492-2668-484a-a1a9-0c1404a30918/prometheus-operator/0.log"
Apr 23 19:04:32.018831 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:32.018805 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gzbjc_b84d1492-2668-484a-a1a9-0c1404a30918/kube-rbac-proxy/0.log"
Apr 23 19:04:32.054996 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:32.054968 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-vsxkq_33a54c72-a227-4426-b514-e41232818756/prometheus-operator-admission-webhook/0.log"
Apr 23 19:04:32.100463 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:32.100424 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54cfcb5d4b-ngzbz_2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc/telemeter-client/0.log"
Apr 23 19:04:32.121119 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:32.121086 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54cfcb5d4b-ngzbz_2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc/reload/0.log"
Apr 23 19:04:32.142887 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:32.142849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54cfcb5d4b-ngzbz_2fcc7b2c-3477-450d-bd6a-b9d51c4abbcc/kube-rbac-proxy/0.log"
Apr 23 19:04:33.902358 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:33.902327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/1.log"
Apr 23 19:04:33.910637 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:33.910598 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d447l_49f66b8f-5bbb-4d67-9dfd-cd24fb73773b/console-operator/2.log"
Apr 23 19:04:34.288760 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.288734 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64444cd4cc-bz564_499a7a87-1fe5-4e78-8769-1cdd08a5ad3c/console/0.log"
Apr 23 19:04:34.620893 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.620797 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"]
Apr 23 19:04:34.621216 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.621198 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="copy"
Apr 23 19:04:34.621304 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.621219 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="copy"
Apr 23 19:04:34.621304 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.621251 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="gather"
Apr 23 19:04:34.621304 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.621260 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="gather"
Apr 23 19:04:34.621463 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.621346 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="gather"
Apr 23 19:04:34.621463 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.621359 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4537785-00f7-40bb-b4d1-06b1b0f4da28" containerName="copy"
Apr 23 19:04:34.626582 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.626556 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.628456 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.628435 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-25xv6\"/\"default-dockercfg-mzws8\""
Apr 23 19:04:34.629073 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.629055 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-25xv6\"/\"kube-root-ca.crt\""
Apr 23 19:04:34.629128 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.629063 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-25xv6\"/\"openshift-service-ca.crt\""
Apr 23 19:04:34.634150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.634115 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"]
Apr 23 19:04:34.729504 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.729467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-lib-modules\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.729751 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.729520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-sys\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.729751 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.729628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt8bq\" (UniqueName: \"kubernetes.io/projected/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-kube-api-access-lt8bq\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.729751 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.729684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-podres\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.729751 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.729713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-proc\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.791583 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.791528 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5jh6m_f49a4cd8-3fa6-403a-b269-d6e4ed51f7c4/volume-data-source-validator/0.log"
Apr 23 19:04:34.830989 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.830950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-sys\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt8bq\" (UniqueName: \"kubernetes.io/projected/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-kube-api-access-lt8bq\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-podres\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-proc\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-sys\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-lib-modules\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831150 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-proc\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831364 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-podres\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.831364 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.831277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-lib-modules\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.839165 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.839124 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt8bq\" (UniqueName: \"kubernetes.io/projected/056ee1be-ebc6-43c0-8f00-f7c35aba56c1-kube-api-access-lt8bq\") pod \"perf-node-gather-daemonset-spgb9\" (UID: \"056ee1be-ebc6-43c0-8f00-f7c35aba56c1\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:34.937191 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:34.937097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:35.064683 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.064563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"]
Apr 23 19:04:35.067448 ip-10-0-143-63 kubenswrapper[2578]: W0423 19:04:35.067417 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod056ee1be_ebc6_43c0_8f00_f7c35aba56c1.slice/crio-6e0cd0be8cfe04d37e22999b2d52cba690ed9812ce90bc91c60cac90d976ce8c WatchSource:0}: Error finding container 6e0cd0be8cfe04d37e22999b2d52cba690ed9812ce90bc91c60cac90d976ce8c: Status 404 returned error can't find the container with id 6e0cd0be8cfe04d37e22999b2d52cba690ed9812ce90bc91c60cac90d976ce8c
Apr 23 19:04:35.547103 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.547069 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bgn7x_5240d464-6fd9-4f8a-819f-0385f4314995/dns/0.log"
Apr 23 19:04:35.568525 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.568488 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bgn7x_5240d464-6fd9-4f8a-819f-0385f4314995/kube-rbac-proxy/0.log"
Apr 23 19:04:35.601185 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.601145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9" event={"ID":"056ee1be-ebc6-43c0-8f00-f7c35aba56c1","Type":"ContainerStarted","Data":"fdc9c3b9563d6a03b7cde700a4e40dc8a171bb39d64a46d0288d1457835ee134"}
Apr 23 19:04:35.601185 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.601188 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9" event={"ID":"056ee1be-ebc6-43c0-8f00-f7c35aba56c1","Type":"ContainerStarted","Data":"6e0cd0be8cfe04d37e22999b2d52cba690ed9812ce90bc91c60cac90d976ce8c"}
Apr 23 19:04:35.601418 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.601222 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:35.617236 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.617173 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9" podStartSLOduration=1.617154583 podStartE2EDuration="1.617154583s" podCreationTimestamp="2026-04-23 19:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:04:35.615053063 +0000 UTC m=+3959.848794979" watchObservedRunningTime="2026-04-23 19:04:35.617154583 +0000 UTC m=+3959.850896498"
Apr 23 19:04:35.669220 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:35.669190 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hl5qq_d8d9c074-5a2a-4898-b910-f1a16ffc62fc/dns-node-resolver/0.log"
Apr 23 19:04:36.206370 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:36.206325 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l4wm6_22a39804-db9b-4a6b-a927-b5f0bb1d22eb/node-ca/0.log"
Apr 23 19:04:36.956154 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:36.956036 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-64c9b47658-qqqmr_778d33bd-ade8-4471-a0d0-10670f14a624/router/0.log"
Apr 23 19:04:37.309239 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:37.309201 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-86jl7_ff148188-17a2-4b88-a857-ae14164f4a06/serve-healthcheck-canary/0.log"
Apr 23 19:04:37.717273 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:37.717169 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mlfw_ea634ca7-4a3e-497f-a8d4-a4443b1dcf50/kube-rbac-proxy/0.log"
Apr 23 19:04:37.737883 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:37.737854 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mlfw_ea634ca7-4a3e-497f-a8d4-a4443b1dcf50/exporter/0.log"
Apr 23 19:04:37.759432 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:37.759398 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8mlfw_ea634ca7-4a3e-497f-a8d4-a4443b1dcf50/extractor/0.log"
Apr 23 19:04:40.117403 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:40.117366 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-874ff48d-4pn4f_53ed2eb6-0c4c-434b-ab08-e950a4695d38/manager/0.log"
Apr 23 19:04:40.150392 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:40.150361 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-snz24_9c35681e-156f-4eae-8f46-35c66086eb3e/manager/0.log"
Apr 23 19:04:40.608853 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:40.608810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-lhlk7_c873385c-79c2-484a-9276-a053d3fe4743/manager/0.log"
Apr 23 19:04:40.635403 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:40.635363 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-mls9c_d1d0fa94-8970-4c49-9bad-c7a1b218c2a0/s3-init/0.log"
Apr 23 19:04:40.657165 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:40.657138 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-svf87_cac62d37-3ae2-41f0-b2ef-81e680878bd4/s3-tls-init-custom/0.log"
Apr 23 19:04:40.764191 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:40.764145 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-cqtwd_16fe4703-96ab-4ece-9f48-0f51e78658ad/seaweedfs-tls-serving/0.log"
Apr 23 19:04:41.615459 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:41.615431 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-spgb9"
Apr 23 19:04:45.155369 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:45.155319 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qhb84_0af64194-8451-4345-9044-583d24fa444c/kube-storage-version-migrator-operator/1.log"
Apr 23 19:04:45.157713 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:45.157682 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qhb84_0af64194-8451-4345-9044-583d24fa444c/kube-storage-version-migrator-operator/0.log"
Apr 23 19:04:46.104783 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.104752 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/kube-multus-additional-cni-plugins/0.log"
Apr 23 19:04:46.129370 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.129334 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/egress-router-binary-copy/0.log"
Apr 23 19:04:46.154607 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.154574 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/cni-plugins/0.log"
Apr 23 19:04:46.177267 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.177232 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/bond-cni-plugin/0.log"
Apr 23 19:04:46.199344 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.199310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/routeoverride-cni/0.log"
Apr 23 19:04:46.250557 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.250398 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/whereabouts-cni-bincopy/0.log"
Apr 23 19:04:46.269503 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.269473 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8d8vh_c04a156a-dd80-4859-a932-b0e25e9bce6b/whereabouts-cni/0.log"
Apr 23 19:04:46.747486 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.747434 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtg5v_9d9582b8-817a-4d02-862f-e5bbde6a1652/kube-multus/0.log"
Apr 23 19:04:46.945038 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.944980 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xwp2q_c5673cab-427f-416d-a4ba-94ac7c29dc9c/network-metrics-daemon/0.log"
Apr 23 19:04:46.977957 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:46.977921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xwp2q_c5673cab-427f-416d-a4ba-94ac7c29dc9c/kube-rbac-proxy/0.log"
Apr 23 19:04:48.351695 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.351660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-controller/0.log"
Apr 23 19:04:48.371744 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.371701 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/0.log"
Apr 23 19:04:48.408793 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.408758 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovn-acl-logging/1.log"
Apr 23 19:04:48.432189 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.432145 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/kube-rbac-proxy-node/0.log"
Apr 23 19:04:48.455628 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.455595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 19:04:48.472571 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.472512 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/northd/0.log"
Apr 23 19:04:48.494215 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.494173 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/nbdb/0.log"
Apr 23 19:04:48.515012 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.514970 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/sbdb/0.log"
Apr 23 19:04:48.696604 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:48.696495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-978cv_246e705b-d502-4b42-bea0-4b6149b86183/ovnkube-controller/0.log"
Apr 23 19:04:50.033391 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:50.033354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-xfl9g_c0426005-0ecf-4d42-aaec-e90027db197e/check-endpoints/0.log"
Apr 23 19:04:50.108729 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:50.108700 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gtd8z_979ab58c-b655-4aab-94f9-8920472712df/network-check-target-container/0.log"
Apr 23 19:04:51.066843 ip-10-0-143-63 kubenswrapper[2578]: I0423 19:04:51.066810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tprc2_92df26fb-43f5-4d39-9c51-669235fa190e/iptables-alerter/0.log"