Apr 16 22:10:58.973330 ip-10-0-138-191 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:10:58.973341 ip-10-0-138-191 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:10:58.973348 ip-10-0-138-191 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:10:58.973568 ip-10-0-138-191 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:11:08.978468 ip-10-0-138-191 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:11:08.978489 ip-10-0-138-191 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 678b9c99428241fba37ded1fbc526d39 --
Apr 16 22:13:32.600248 ip-10-0-138-191 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:33.176089 ip-10-0-138-191 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:33.176089 ip-10-0-138-191 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:33.176089 ip-10-0-138-191 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:33.176089 ip-10-0-138-191 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:33.176089 ip-10-0-138-191 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:33.177823 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.177730 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:33.182008 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.181992 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:33.182008 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182009 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182013 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182018 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182021 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182025 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182028 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182031 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182034 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182037 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182040 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182043 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182045 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182048 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182052 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182054 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182057 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182060 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182062 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182065 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182067 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:33.182078 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182070 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182072 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182075 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182078 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182080 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182083 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182086 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182089 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182091 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182094 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182096 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182099 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182102 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182104 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182107 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182110 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182113 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182115 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182117 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:33.182561 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182120 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182123 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182125 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182128 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182130 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182133 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182135 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182138 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182140 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182142 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182145 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182147 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182150 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182152 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182156 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182161 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182164 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182168 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182171 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:33.183077 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182174 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182176 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182179 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182181 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182184 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182187 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182189 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182192 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182195 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182197 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182200 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182202 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182205 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182208 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182211 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182214 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182217 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182219 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182222 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182225 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:33.183565 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182227 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182230 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182233 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182237 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182240 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182243 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182246 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182632 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182637 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182640 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182642 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182645 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182648 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182651 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182653 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182656 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182659 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182662 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182664 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182667 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:33.184073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182670 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182672 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182675 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182678 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182680 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182683 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182685 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182688 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182690 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182693 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182695 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182698 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182702 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182704 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182707 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182710 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182714 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182716 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182719 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:33.184587 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182721 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182725 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182729 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182732 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182734 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182737 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182740 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182742 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182745 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182747 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182750 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182753 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182755 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182760 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182762 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182765 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182767 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182770 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182772 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182775 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:33.185091 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182777 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182780 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182782 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182785 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182787 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182790 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182793 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182795 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182798 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182800 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182803 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182806 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182808 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182811 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182814 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182816 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182819 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182822 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182826 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:33.185618 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182829 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182832 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182835 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182838 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182840 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182843 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182845 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182849 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182851 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182854 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182856 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182859 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182861 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182864 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.182866 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183664 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183673 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183680 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183685 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183694 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:33.186105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183698 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183703 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183708 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183711 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183714 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183717 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183721 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183724 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183727 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183730 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183733 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183736 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183739 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183742 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183747 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183750 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183753 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183759 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183763 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183767 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183777 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183780 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183783 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183787 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:33.186597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183789 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183793 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183796 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183799 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183803 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183806 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183809 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183812 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183815 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183818 2574 flags.go:64]
FLAG: --enforce-node-allocatable="[pods]" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183829 2574 flags.go:64] FLAG: --event-burst="100" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183832 2574 flags.go:64] FLAG: --event-qps="50" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183835 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183839 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183842 2574 flags.go:64] FLAG: --eviction-hard="" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183846 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183848 2574 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183852 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183855 2574 flags.go:64] FLAG: --eviction-soft="" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183858 2574 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183860 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183863 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183866 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183869 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 22:13:33.187197 
ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183872 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:13:33.187197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183876 2574 flags.go:64] FLAG: --feature-gates="" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183880 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183883 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183886 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183895 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183898 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183901 2574 flags.go:64] FLAG: --help="false" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183904 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183908 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183911 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183914 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183917 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183920 2574 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183924 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183927 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183945 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183948 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183951 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183955 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183957 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183961 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183963 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183967 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183969 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:33.187942 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183972 2574 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183975 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183978 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: 
I0416 22:13:33.183981 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183987 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183990 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183993 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.183996 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184000 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184003 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184006 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184009 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184013 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184017 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184021 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184024 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184027 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184030 2574 flags.go:64] FLAG: 
--memory-manager-policy="None" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184033 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184036 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184039 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184045 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184054 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184057 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184060 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:33.188536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184063 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184066 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184072 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184075 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184078 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184081 2574 flags.go:64] FLAG: --port="10250" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184085 2574 
flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184088 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0644491e1273f7bae" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184091 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184094 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184097 2574 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184100 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184104 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184107 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184110 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184114 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184117 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184121 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184124 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184127 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184130 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 
22:13:33.184134 2574 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184137 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184140 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184143 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:33.189167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184146 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184149 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184152 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184155 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184159 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184163 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184166 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184169 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184172 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184175 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184178 2574 flags.go:64] FLAG: 
--system-cgroups="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184181 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184186 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184189 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184192 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184196 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184200 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184202 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184205 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184208 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184212 2574 flags.go:64] FLAG: --v="2" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184216 2574 flags.go:64] FLAG: --version="false" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184222 2574 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184227 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.184230 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:33.189765 ip-10-0-138-191 kubenswrapper[2574]: W0416 
22:13:33.184330 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184334 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184337 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184340 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184344 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184347 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184350 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184354 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184357 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184359 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184362 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184365 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184368 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184372 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184376 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184379 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184382 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184385 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184387 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:33.190388 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184390 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184392 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184395 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184397 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184400 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184402 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184405 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184408 2574 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184410 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184412 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184417 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184419 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184422 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184424 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184427 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184429 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184432 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184434 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184438 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184440 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:33.190858 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184443 2574 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184445 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184448 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184450 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184454 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184456 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184459 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184461 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184464 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184466 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184469 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184472 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184474 2574 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184477 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184479 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184482 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184484 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184487 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184489 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:33.191429 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184492 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184495 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184497 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184504 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184507 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184509 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:33.191906 ip-10-0-138-191 
kubenswrapper[2574]: W0416 22:13:33.184512    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184514    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184516    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184520    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184523    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184525    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184528    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184531    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184534    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184537    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184539    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184543    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184546    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184548    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:33.191906 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184551    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184553    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184556    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184558    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184561    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184563    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184566    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.184569    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.185314    2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.191800    2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.191816    2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191870    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191874    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191878    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191881    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191884    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:33.192402 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191888    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191890    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191893    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191896    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191899    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191902    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191905    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191907    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191910    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191913    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191916    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191919    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191922    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191924    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191942    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191947    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191950    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191953    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191957    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:33.192798 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191961    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191964    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191967    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191970    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191972    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191975    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.191994    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192004    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192007    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192011    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192016    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192019    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192022    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192025    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192028    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192030    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192033    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192035    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192038    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:33.193294 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192040    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192043    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192045    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192048    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192051    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192053    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192055    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192058    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192061    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192063    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192066    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192068    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192071    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192074    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192076    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192079    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192082    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192084    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192087    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192090    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:33.193749 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192093    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192102    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192105    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192108    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192110    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192113    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192116    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192118    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192121    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192123    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192126    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192129    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192131    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192134    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192136    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192139    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192141    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192143    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192146    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192149    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:33.194254 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192152    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192154    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192157    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.192161    2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192287    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192292    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192295    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192298    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192301    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192304    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192307    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192310    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192313    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192315    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192325    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:33.194786 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192327    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192330    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192333    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192336    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192338    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192341    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192343    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192345    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192348    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192350    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192353    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192356    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192358    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192362    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192366    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192369    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192371    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192374    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192376    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:33.195180 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192379    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192381    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192384    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192386    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192389    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192392    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192394    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192397    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192399    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192401    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192405    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192408    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192410    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192418    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192422    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192424    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192427    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192429    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192432    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192434    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:33.195640 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192436    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192439    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192441    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192444    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192447    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192449    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192452    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192454    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192459    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192462    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192465    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192468    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192470    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192473    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192476    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192479    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192481    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192484    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192487    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192489    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:33.196174 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192491    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192494    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192496    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192499    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192502    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192504    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192512    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192515    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192517    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192520    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192522    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192525    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192527    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192529    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192532    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:33.196661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:33.192534    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:33.197077 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.192539    2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:33.197077 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.193468    2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:33.197077 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.195634    2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:33.197077 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.196585    2574 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:33.197077 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.196687    2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:33.198484 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.198470    2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:33.225385 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.225362    2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:33.233145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.233110    2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:33.249060 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.249034    2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:13:33.256665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.256647    2574 log.go:25] "Validated CRI v1 image API"
Apr 16 22:13:33.258847 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.258821    2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:13:33.263471 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.263452    2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:33.263707 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.263687    2574 fs.go:135] Filesystem UUIDs: map[475e0b1b-a219-43b8-9b42-7ea2c3711746:/dev/nvme0n1p4 5a53d318-ad49-44c8-908c-619e5a945282:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 22:13:33.263751 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.263708    2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:13:33.269432 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.269315    2574 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:33.267979604 +0000 UTC m=+0.510795657 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102236 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2961c69185403e2e3c2f0e929a409e SystemUUID:ec2961c6-9185-403e-2e3c-2f0e929a409e BootID:678b9c99-4282-41fb-a37d-ed1fbc526d39 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9a:d6:dc:ce:61 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9a:d6:dc:ce:61 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:50:11:95:16:ea Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:33.269432 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.269428    2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:33.269535 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.269508    2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:33.269868 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.269844    2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:33.270062 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.269870    2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-191.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:13:33.270109 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.270072    2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:13:33.270109 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.270088    2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:13:33.270164 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.270113
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:33.270738 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.270728 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:33.271955 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.271944 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:33.272065 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.272056 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:33.275488 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.275476 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:33.275535 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.275497 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:13:33.275535 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.275514 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:13:33.275535 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.275523 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:33.275535 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.275531 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 22:13:33.276597 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.276585 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:33.276642 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.276603 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:33.280352 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.280332 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:33.281771 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:13:33.281757 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:33.285612 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285587 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285599 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dsvnx" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285626 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285639 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285649 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285664 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285680 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285691 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 22:13:33.285700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285699 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:33.285982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285711 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:33.285982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285721 2574 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:33.285982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285736 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:33.285982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.285750 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:33.286940 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.286898 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:33.287044 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.286951 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:33.290854 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.290837 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-191.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:13:33.290996 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.290985 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:33.291040 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.291029 2574 server.go:1295] "Started kubelet" Apr 16 22:13:33.291329 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.291306 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:33.291493 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.291453 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:33.291531 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.291521 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:33.292765 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.292721 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:13:33.292899 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.292869 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dsvnx" Apr 16 22:13:33.293187 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.293171 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:33.293338 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.293320 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-191.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:13:33.293570 ip-10-0-138-191 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 22:13:33.293705 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.293691 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:33.299479 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.298168 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-191.ec2.internal.18a6f607132a8320 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-191.ec2.internal,UID:ip-10-0-138-191.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-191.ec2.internal,},FirstTimestamp:2026-04-16 22:13:33.290996512 +0000 UTC m=+0.533812567,LastTimestamp:2026-04-16 22:13:33.290996512 +0000 UTC m=+0.533812567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-191.ec2.internal,}" Apr 16 22:13:33.305403 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.305372 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:33.306905 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.306889 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:33.306969 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.306952 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:33.307922 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.307861 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:33.307922 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.307863 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:33.307922 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.307883 2574 factory.go:153] Registering CRI-O factory Apr 16 22:13:33.307922 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.307895 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.307959 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.307971 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.308014 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.308017 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.308029 2574 factory.go:55] Registering systemd factory Apr 16 22:13:33.308160 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:13:33.308038 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.308055 2574 factory.go:103] Registering Raw factory Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.308068 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:33.308160 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.308092 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 22:13:33.308550 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.308481 2574 manager.go:319] Starting recovery of all containers Apr 16 22:13:33.309507 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.309478 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:33.312771 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.312604 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-191.ec2.internal\" not found" node="ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.318726 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.318701 2574 manager.go:324] Recovery completed Apr 16 22:13:33.324037 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.324018 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:33.326755 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.326740 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:33.326827 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.326768 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:33.326827 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:13:33.326779 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:33.327258 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.327236 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:33.327258 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.327246 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:33.327346 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.327263 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:33.331100 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.331088 2574 policy_none.go:49] "None policy: Start" Apr 16 22:13:33.331142 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.331105 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:33.331142 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.331115 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371515 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.371543 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371555 2574 server.go:85] "Starting device plugin registration server" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371758 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371771 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371858 2574 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371969 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.371978 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.372372 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 22:13:33.394416 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.372406 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 22:13:33.442476 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.442403 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 22:13:33.443627 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.443606 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 22:13:33.443692 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.443636 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:13:33.443692 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.443660 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 22:13:33.443692 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.443671 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:13:33.443811 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.443714 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:13:33.446883 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.446861 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:33.472651 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.472632 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:33.473639 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.473624 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:33.473731 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.473655 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:33.473731 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.473666 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:33.473731 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.473690 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.479912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.479898 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.479979 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.479920 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-191.ec2.internal\": node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 
22:13:33.494148 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.494124 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 22:13:33.543971 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.543915 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal"] Apr 16 22:13:33.544050 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.544012 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:33.544839 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.544823 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:33.544902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.544851 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:33.544902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.544862 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:33.546128 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546115 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:33.546272 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546259 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.546330 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546290 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:33.546842 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546827 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:33.546893 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546857 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:33.546893 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546867 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:33.546983 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546832 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:33.546983 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546944 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:33.546983 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.546956 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:33.548039 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.548026 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.548090 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.548049 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:33.548649 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.548633 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:33.548649 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.548658 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:33.548753 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.548672 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:33.576537 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.576515 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-191.ec2.internal\" not found" node="ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.580717 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.580701 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-191.ec2.internal\" not found" node="ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.594694 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.594673 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 22:13:33.695260 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.695181 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 22:13:33.709603 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.709576 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.709720 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.709607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.709720 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.709624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09a45ec1566c454073ee33f001f99f61-config\") pod \"kube-apiserver-proxy-ip-10-0-138-191.ec2.internal\" (UID: \"09a45ec1566c454073ee33f001f99f61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" Apr 16 22:13:33.796030 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.796000 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found" Apr 16 22:13:33.810380 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.810307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" Apr 16 
22:13:33.810441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.810410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.810441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.810430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09a45ec1566c454073ee33f001f99f61-config\") pod \"kube-apiserver-proxy-ip-10-0-138-191.ec2.internal\" (UID: \"09a45ec1566c454073ee33f001f99f61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.810508 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.810362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.810508 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.810471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/09a45ec1566c454073ee33f001f99f61-config\") pod \"kube-apiserver-proxy-ip-10-0-138-191.ec2.internal\" (UID: \"09a45ec1566c454073ee33f001f99f61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.810508 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.810477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e9ac9a5ecee46125a37b1b7d1e8dc22-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal\" (UID: \"7e9ac9a5ecee46125a37b1b7d1e8dc22\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.878490 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.878452 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.883013 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:33.882995 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:33.897010 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.896986 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:33.997532 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:33.997448 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.098015 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:34.097989 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.196489 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.196461 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:34.197095 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.196596 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:34.197095 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.196640 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:34.198602 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:34.198584 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.298288 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.298201 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:33 +0000 UTC" deadline="2027-12-03 18:59:55.912254492 +0000 UTC"
Apr 16 22:13:34.298288 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.298237 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14300h46m21.614020088s"
Apr 16 22:13:34.299423 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:34.299404 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.307086 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.307068 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:34.317425 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.317409 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:34.336335 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.336314 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hgt77"
Apr 16 22:13:34.343588 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.343570 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hgt77"
Apr 16 22:13:34.399832 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:34.399807 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.469119 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:34.469076 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9ac9a5ecee46125a37b1b7d1e8dc22.slice/crio-857df1cc6d46462eacff237b181ff713f48a6131eec4911c40bd1232249e4423 WatchSource:0}: Error finding container 857df1cc6d46462eacff237b181ff713f48a6131eec4911c40bd1232249e4423: Status 404 returned error can't find the container with id 857df1cc6d46462eacff237b181ff713f48a6131eec4911c40bd1232249e4423
Apr 16 22:13:34.469512 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:34.469485 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a45ec1566c454073ee33f001f99f61.slice/crio-6a628563547a6afa8e788daad9398f614d7d1c230b247e52b8e5621558f37e0d WatchSource:0}: Error finding container 6a628563547a6afa8e788daad9398f614d7d1c230b247e52b8e5621558f37e0d: Status 404 returned error can't find the container with id 6a628563547a6afa8e788daad9398f614d7d1c230b247e52b8e5621558f37e0d
Apr 16 22:13:34.473044 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.473026 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:34.500405 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:34.500383 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.600980 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:34.600907 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-191.ec2.internal\" not found"
Apr 16 22:13:34.678634 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.678609 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:34.707469 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.707447 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:34.718459 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.718439 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:34.720142 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.720130 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal"
Apr 16 22:13:34.728711 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.728696 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:34.774100 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:34.774079 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:35.277262 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.277230 2574 apiserver.go:52] "Watching apiserver"
Apr 16 22:13:35.286757 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.286732 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:13:35.287210 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.287177 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-jhsst","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6","openshift-cluster-node-tuning-operator/tuned-lhbpr","openshift-dns/node-resolver-dm8zf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal","openshift-multus/multus-additional-cni-plugins-5pd4l","openshift-network-operator/iptables-alerter-gssjh","openshift-ovn-kubernetes/ovnkube-node-z2ds7","kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal","openshift-image-registry/node-ca-2grf8","openshift-multus/multus-wjxwh","openshift-multus/network-metrics-daemon-knqmk","openshift-network-diagnostics/network-check-target-4g7hv"]
Apr 16 22:13:35.289087 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.289067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gssjh"
Apr 16 22:13:35.290289 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.290261 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6"
Apr 16 22:13:35.291565 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.291523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.291687 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.291589 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.291747 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.291680 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9sh2z\""
Apr 16 22:13:35.291747 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.291738 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 22:13:35.291986 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.291972 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.292506 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.292483 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:13:35.292605 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.292581 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2mrwm\""
Apr 16 22:13:35.292736 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.292708 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.292805 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.292740 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.293912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.293892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tqlck\""
Apr 16 22:13:35.294456 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.294181 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dm8zf"
Apr 16 22:13:35.294456 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.294218 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.294456 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.294233 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.295709 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.295692 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5pd4l"
Apr 16 22:13:35.295816 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.295797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jhsst"
Apr 16 22:13:35.296094 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.296073 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.296226 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.296208 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-czb2r\""
Apr 16 22:13:35.296295 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.296077 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.297405 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.297211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.297493 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.297477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:13:35.298050 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298031 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:13:35.298050 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k5xc6\""
Apr 16 22:13:35.298306 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298254 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:13:35.298511 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298415 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:13:35.298511 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298430 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:13:35.298652 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298561 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.298652 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298633 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.298749 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fqgp6\""
Apr 16 22:13:35.298983 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.298967 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2grf8"
Apr 16 22:13:35.299413 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.299398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.299488 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.299444 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:13:35.299750 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.299733 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:13:35.299810 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.299769 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:13:35.300131 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.300115 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pppvr\""
Apr 16 22:13:35.300214 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.300172 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.300668 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.300650 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.300922 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.300903 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:13:35.301397 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.301376 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:13:35.301397 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.301394 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vpmb5\""
Apr 16 22:13:35.301543 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.301428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:13:35.301543 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.301480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:13:35.302965 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.302946 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:13:35.303175 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.303122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:35.303426 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.303386 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:35.303826 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.303807 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lff65\""
Apr 16 22:13:35.305117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.305084 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:35.305248 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.305226 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:35.309849 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.309832 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:35.318632 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-cni-bin\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.318727 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-conf-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.318727 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp55v\" (UniqueName: \"kubernetes.io/projected/cc0a50b2-d73b-40da-a946-11e81bed8282-kube-api-access-qp55v\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh"
Apr 16 22:13:35.318833 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-etc-selinux\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6"
Apr 16 22:13:35.318833 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318775 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-run\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.318833 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7zx\" (UniqueName: \"kubernetes.io/projected/9614d8df-9bb5-4a22-a608-e18aa7fb1162-kube-api-access-bd7zx\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8"
Apr 16 22:13:35.318833 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-cni-binary-copy\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86j2p\" (UniqueName: \"kubernetes.io/projected/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-kube-api-access-86j2p\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc0a50b2-d73b-40da-a946-11e81bed8282-host-slash\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh"
Apr 16 22:13:35.319042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysconfig\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.319042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.318976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysctl-conf\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.319042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-var-lib-kubelet\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.319042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-systemd-units\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-systemd\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-k8s-cni-cncf-io\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-kubelet\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319116 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-daemon-config\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9614d8df-9bb5-4a22-a608-e18aa7fb1162-serviceca\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2e4bc53-aead-430d-aaf8-6def343926ef-tmp-dir\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-registration-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6"
Apr 16 22:13:35.319326 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32b9b798-09e2-4502-8f94-c5f194be68e3-tmp\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-system-cni-dir\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319364 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljktp\" (UniqueName: \"kubernetes.io/projected/29ce4801-ff31-4651-98b4-aba09699b7b6-kube-api-access-ljktp\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-os-release\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-socket-dir-parent\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-os-release\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319494 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-run-netns\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-var-lib-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6970b263-be92-460e-92da-a049f7bdbafe-ovn-node-metrics-cert\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0208e93a-b489-404a-8e48-d0d66d76793f-agent-certs\") pod \"konnectivity-agent-jhsst\" (UID: \"0208e93a-b489-404a-8e48-d0d66d76793f\") " pod="kube-system/konnectivity-agent-jhsst"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-system-cni-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.319758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cc0a50b2-d73b-40da-a946-11e81bed8282-iptables-alerter-script\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-sys-fs\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-kubernetes\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319847 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.319905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-cni-netd\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-cni-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-cni-multus\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s7t\" (UniqueName: \"kubernetes.io/projected/cc045530-7e0f-412e-98ba-915fe7aa6d22-kube-api-access-h8s7t\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysctl-d\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-lib-modules\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-tuned\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxnqx\" (UniqueName: \"kubernetes.io/projected/32b9b798-09e2-4502-8f94-c5f194be68e3-kube-api-access-mxnqx\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr"
Apr 16 22:13:35.320549 ip-10-0-138-191
kubenswrapper[2574]: I0416 22:13:35.320250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-ovnkube-script-lib\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9614d8df-9bb5-4a22-a608-e18aa7fb1162-host\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-host\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.320549 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: 
\"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-node-log\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-log-socket\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320517 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0208e93a-b489-404a-8e48-d0d66d76793f-konnectivity-ca\") pod \"konnectivity-agent-jhsst\" (UID: \"0208e93a-b489-404a-8e48-d0d66d76793f\") " pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-netns\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320577 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-hostroot\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-slash\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-cni-bin\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctl7g\" (UniqueName: \"kubernetes.io/projected/d2e4bc53-aead-430d-aaf8-6def343926ef-kube-api-access-ctl7g\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.320968 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321049 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-socket-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-etc-kubernetes\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-device-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321116 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-modprobe-d\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") 
" pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-kubelet\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.321369 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-ovn\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-env-overrides\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/d2e4bc53-aead-430d-aaf8-6def343926ef-hosts-file\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-cnibin\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321339 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6d9q\" (UniqueName: \"kubernetes.io/projected/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-kube-api-access-v6d9q\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-cnibin\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk2bn\" (UniqueName: \"kubernetes.io/projected/6970b263-be92-460e-92da-a049f7bdbafe-kube-api-access-mk2bn\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321440 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-multus-certs\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-sys\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-etc-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: 
\"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-ovnkube-config\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.322145 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.321584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-systemd\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.345690 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.345566 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:34 +0000 UTC" deadline="2028-01-19 09:12:00.707775692 +0000 UTC" Apr 16 22:13:35.345690 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.345664 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15418h58m25.36211659s" Apr 16 22:13:35.421882 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421847 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32b9b798-09e2-4502-8f94-c5f194be68e3-tmp\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-system-cni-dir\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljktp\" (UniqueName: \"kubernetes.io/projected/29ce4801-ff31-4651-98b4-aba09699b7b6-kube-api-access-ljktp\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-os-release\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-socket-dir-parent\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421989 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-system-cni-dir\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.421994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-os-release\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-run-netns\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-os-release\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.422064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-os-release\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.422064 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:13:35.422065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-var-lib-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-run-netns\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6970b263-be92-460e-92da-a049f7bdbafe-ovn-node-metrics-cert\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-socket-dir-parent\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0208e93a-b489-404a-8e48-d0d66d76793f-agent-certs\") pod \"konnectivity-agent-jhsst\" (UID: \"0208e93a-b489-404a-8e48-d0d66d76793f\") " pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:35.422592 
ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-system-cni-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-var-lib-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cc0a50b2-d73b-40da-a946-11e81bed8282-iptables-alerter-script\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-sys-fs\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-system-cni-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 
22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-kubernetes\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-cni-netd\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422319 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-sys-fs\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422434 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-kubernetes\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-cni-netd\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-cni-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.422592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-cni-multus\") 
pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s7t\" (UniqueName: \"kubernetes.io/projected/cc045530-7e0f-412e-98ba-915fe7aa6d22-kube-api-access-h8s7t\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422567 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysctl-d\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-lib-modules\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-tuned\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxnqx\" (UniqueName: 
\"kubernetes.io/projected/32b9b798-09e2-4502-8f94-c5f194be68e3-kube-api-access-mxnqx\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-ovnkube-script-lib\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9614d8df-9bb5-4a22-a608-e18aa7fb1162-host\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-host\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-node-log\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-log-socket\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422819 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0208e93a-b489-404a-8e48-d0d66d76793f-konnectivity-ca\") pod \"konnectivity-agent-jhsst\" (UID: \"0208e93a-b489-404a-8e48-d0d66d76793f\") " pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422849 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-netns\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cc0a50b2-d73b-40da-a946-11e81bed8282-iptables-alerter-script\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.423534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-cni-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-hostroot\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9614d8df-9bb5-4a22-a608-e18aa7fb1162-host\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.423037 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 
22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-lib-modules\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.422879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-hostroot\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.423123 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:35.923080365 +0000 UTC m=+3.165896405 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-cni-multus\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-slash\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-cni-bin\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 
22:13:35.423225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctl7g\" (UniqueName: \"kubernetes.io/projected/d2e4bc53-aead-430d-aaf8-6def343926ef-kube-api-access-ctl7g\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysctl-d\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-socket-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-etc-kubernetes\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.424411 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:13:35.423322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-device-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.424411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-modprobe-d\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423370 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-kubelet\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-ovn\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-env-overrides\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-ovnkube-script-lib\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2e4bc53-aead-430d-aaf8-6def343926ef-hosts-file\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-host\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-cnibin\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 
22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-netns\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6d9q\" (UniqueName: \"kubernetes.io/projected/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-kube-api-access-v6d9q\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-cnibin\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk2bn\" (UniqueName: \"kubernetes.io/projected/6970b263-be92-460e-92da-a049f7bdbafe-kube-api-access-mk2bn\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-multus-certs\") pod \"multus-wjxwh\" (UID: 
\"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-sys\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.425219 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-etc-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-ovnkube-config\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-systemd\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423872 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-cni-bin\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-slash\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-conf-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-conf-dir\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-kubelet\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.423996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp55v\" (UniqueName: \"kubernetes.io/projected/cc0a50b2-d73b-40da-a946-11e81bed8282-kube-api-access-qp55v\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-ovn\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-node-log\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-etc-selinux\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-modprobe-d\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-run\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424060 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-cni-bin\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424108 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-log-socket\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426023 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-etc-selinux\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-cnibin\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424236 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7zx\" (UniqueName: \"kubernetes.io/projected/9614d8df-9bb5-4a22-a608-e18aa7fb1162-kube-api-access-bd7zx\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424277 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-cni-binary-copy\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86j2p\" (UniqueName: \"kubernetes.io/projected/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-kube-api-access-86j2p\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424322 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-multus-certs\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc0a50b2-d73b-40da-a946-11e81bed8282-host-slash\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysconfig\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/cc0a50b2-d73b-40da-a946-11e81bed8282-host-slash\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysctl-conf\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-etc-kubernetes\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-var-lib-kubelet\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-systemd-units\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-var-lib-kubelet\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-systemd\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0208e93a-b489-404a-8e48-d0d66d76793f-konnectivity-ca\") pod \"konnectivity-agent-jhsst\" (UID: \"0208e93a-b489-404a-8e48-d0d66d76793f\") " pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:35.426809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-k8s-cni-cncf-io\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424516 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2e4bc53-aead-430d-aaf8-6def343926ef-hosts-file\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-cnibin\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-systemd\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-device-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-kubelet\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-kubelet\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-daemon-config\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-run-k8s-cni-cncf-io\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-systemd-units\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9614d8df-9bb5-4a22-a608-e18aa7fb1162-serviceca\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-etc-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2e4bc53-aead-430d-aaf8-6def343926ef-tmp-dir\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-registration-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-run-openvswitch\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424728 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-socket-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.427589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysconfig\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-registration-dir\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-sysctl-conf\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.425117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2e4bc53-aead-430d-aaf8-6def343926ef-tmp-dir\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.425257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29ce4801-ff31-4651-98b4-aba09699b7b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.425363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-cni-binary-copy\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.425965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-tuned\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.424369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6970b263-be92-460e-92da-a049f7bdbafe-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6970b263-be92-460e-92da-a049f7bdbafe-ovn-node-metrics-cert\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426334 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0208e93a-b489-404a-8e48-d0d66d76793f-agent-certs\") pod \"konnectivity-agent-jhsst\" (UID: \"0208e93a-b489-404a-8e48-d0d66d76793f\") " pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-run\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-etc-systemd\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/29ce4801-ff31-4651-98b4-aba09699b7b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32b9b798-09e2-4502-8f94-c5f194be68e3-sys\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-multus-daemon-config\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-host-var-lib-cni-bin\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426734 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-env-overrides\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32b9b798-09e2-4502-8f94-c5f194be68e3-tmp\") pod \"tuned-lhbpr\" (UID: \"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.428402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.426981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/9614d8df-9bb5-4a22-a608-e18aa7fb1162-serviceca\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.429280 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.427347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6970b263-be92-460e-92da-a049f7bdbafe-ovnkube-config\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.433888 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.433862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljktp\" (UniqueName: \"kubernetes.io/projected/29ce4801-ff31-4651-98b4-aba09699b7b6-kube-api-access-ljktp\") pod \"multus-additional-cni-plugins-5pd4l\" (UID: \"29ce4801-ff31-4651-98b4-aba09699b7b6\") " pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.434164 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.434111 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s7t\" (UniqueName: \"kubernetes.io/projected/cc045530-7e0f-412e-98ba-915fe7aa6d22-kube-api-access-h8s7t\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:35.434833 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.434793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86j2p\" (UniqueName: \"kubernetes.io/projected/6aaeb270-2bd7-4647-889b-36ff3ceba5cf-kube-api-access-86j2p\") pod \"multus-wjxwh\" (UID: \"6aaeb270-2bd7-4647-889b-36ff3ceba5cf\") " pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.435370 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.435349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ctl7g\" (UniqueName: \"kubernetes.io/projected/d2e4bc53-aead-430d-aaf8-6def343926ef-kube-api-access-ctl7g\") pod \"node-resolver-dm8zf\" (UID: \"d2e4bc53-aead-430d-aaf8-6def343926ef\") " pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.436880 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.436540 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:35.436880 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.436564 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:35.436880 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.436578 2574 projected.go:194] Error preparing data for projected volume kube-api-access-mdkht for pod openshift-network-diagnostics/network-check-target-4g7hv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:35.436880 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.436637 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht podName:06497fd8-f35d-4fd4-b42b-13ff6ded57e8 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:35.936620126 +0000 UTC m=+3.179436176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mdkht" (UniqueName: "kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht") pod "network-check-target-4g7hv" (UID: "06497fd8-f35d-4fd4-b42b-13ff6ded57e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:35.438621 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.438284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6d9q\" (UniqueName: \"kubernetes.io/projected/3fc696b3-b6dd-4382-8367-ced58e1b1bdd-kube-api-access-v6d9q\") pod \"aws-ebs-csi-driver-node-5pbv6\" (UID: \"3fc696b3-b6dd-4382-8367-ced58e1b1bdd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.438717 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.438695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp55v\" (UniqueName: \"kubernetes.io/projected/cc0a50b2-d73b-40da-a946-11e81bed8282-kube-api-access-qp55v\") pod \"iptables-alerter-gssjh\" (UID: \"cc0a50b2-d73b-40da-a946-11e81bed8282\") " pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.439304 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.439280 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk2bn\" (UniqueName: \"kubernetes.io/projected/6970b263-be92-460e-92da-a049f7bdbafe-kube-api-access-mk2bn\") pod \"ovnkube-node-z2ds7\" (UID: \"6970b263-be92-460e-92da-a049f7bdbafe\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.439432 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.439413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxnqx\" (UniqueName: \"kubernetes.io/projected/32b9b798-09e2-4502-8f94-c5f194be68e3-kube-api-access-mxnqx\") pod \"tuned-lhbpr\" (UID: 
\"32b9b798-09e2-4502-8f94-c5f194be68e3\") " pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.439776 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.439760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7zx\" (UniqueName: \"kubernetes.io/projected/9614d8df-9bb5-4a22-a608-e18aa7fb1162-kube-api-access-bd7zx\") pod \"node-ca-2grf8\" (UID: \"9614d8df-9bb5-4a22-a608-e18aa7fb1162\") " pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.447665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.447627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" event={"ID":"09a45ec1566c454073ee33f001f99f61","Type":"ContainerStarted","Data":"6a628563547a6afa8e788daad9398f614d7d1c230b247e52b8e5621558f37e0d"} Apr 16 22:13:35.448595 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.448572 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" event={"ID":"7e9ac9a5ecee46125a37b1b7d1e8dc22","Type":"ContainerStarted","Data":"857df1cc6d46462eacff237b181ff713f48a6131eec4911c40bd1232249e4423"} Apr 16 22:13:35.586315 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.586252 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:35.602186 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.602152 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gssjh" Apr 16 22:13:35.609874 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.609851 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" Apr 16 22:13:35.613521 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.613505 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:35.616832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.616812 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" Apr 16 22:13:35.623361 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.623344 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dm8zf" Apr 16 22:13:35.630504 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.630482 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" Apr 16 22:13:35.636116 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.636099 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:35.641586 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.641568 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" Apr 16 22:13:35.649069 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.649048 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2grf8" Apr 16 22:13:35.662622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.662601 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wjxwh" Apr 16 22:13:35.927351 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:35.927323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:35.927535 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.927465 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:35.927535 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:35.927521 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:36.927508079 +0000 UTC m=+4.170324123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:36.027919 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.027880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:36.028120 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.028054 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:36.028120 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.028080 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:36.028120 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.028095 2574 projected.go:194] Error preparing data for projected volume kube-api-access-mdkht for pod openshift-network-diagnostics/network-check-target-4g7hv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:36.028272 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.028160 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht podName:06497fd8-f35d-4fd4-b42b-13ff6ded57e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:37.028140495 +0000 UTC m=+4.270956552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdkht" (UniqueName: "kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht") pod "network-check-target-4g7hv" (UID: "06497fd8-f35d-4fd4-b42b-13ff6ded57e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:36.099639 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.099608 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aaeb270_2bd7_4647_889b_36ff3ceba5cf.slice/crio-b65df7aa64799f34814f51b68df9dd45797bc73161cf9e068513481028f2ba84 WatchSource:0}: Error finding container b65df7aa64799f34814f51b68df9dd45797bc73161cf9e068513481028f2ba84: Status 404 returned error can't find the container with id b65df7aa64799f34814f51b68df9dd45797bc73161cf9e068513481028f2ba84 Apr 16 22:13:36.102123 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.102026 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fc696b3_b6dd_4382_8367_ced58e1b1bdd.slice/crio-67a79475c3cf3d106b762a47ad8b08e03f34e71ccfc773f65d093e0c965a21ef WatchSource:0}: Error finding container 67a79475c3cf3d106b762a47ad8b08e03f34e71ccfc773f65d093e0c965a21ef: Status 404 returned error can't find the container with id 67a79475c3cf3d106b762a47ad8b08e03f34e71ccfc773f65d093e0c965a21ef Apr 16 22:13:36.107000 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.106974 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0208e93a_b489_404a_8e48_d0d66d76793f.slice/crio-cf43ac075510422620f521966fa11605b234acc93fa74910d8e4dbf9f7faa77d WatchSource:0}: Error finding container 
cf43ac075510422620f521966fa11605b234acc93fa74910d8e4dbf9f7faa77d: Status 404 returned error can't find the container with id cf43ac075510422620f521966fa11605b234acc93fa74910d8e4dbf9f7faa77d Apr 16 22:13:36.107764 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.107739 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b9b798_09e2_4502_8f94_c5f194be68e3.slice/crio-afe74ce64997425c59d59f9860f07373a05a8754da16e80c617ba676254f14e6 WatchSource:0}: Error finding container afe74ce64997425c59d59f9860f07373a05a8754da16e80c617ba676254f14e6: Status 404 returned error can't find the container with id afe74ce64997425c59d59f9860f07373a05a8754da16e80c617ba676254f14e6 Apr 16 22:13:36.108800 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.108773 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e4bc53_aead_430d_aaf8_6def343926ef.slice/crio-a62a3b959a04b6e8264d49fdb09db472219c833dc756c8308482fd525076103f WatchSource:0}: Error finding container a62a3b959a04b6e8264d49fdb09db472219c833dc756c8308482fd525076103f: Status 404 returned error can't find the container with id a62a3b959a04b6e8264d49fdb09db472219c833dc756c8308482fd525076103f Apr 16 22:13:36.110158 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.110054 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0a50b2_d73b_40da_a946_11e81bed8282.slice/crio-645780e3b90c3b47fc2f37755eb99ac40bebda1b36b61c2b567f65fa3d1f2cde WatchSource:0}: Error finding container 645780e3b90c3b47fc2f37755eb99ac40bebda1b36b61c2b567f65fa3d1f2cde: Status 404 returned error can't find the container with id 645780e3b90c3b47fc2f37755eb99ac40bebda1b36b61c2b567f65fa3d1f2cde Apr 16 22:13:36.111344 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.111295 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6970b263_be92_460e_92da_a049f7bdbafe.slice/crio-8cb5ba6a2034e9aea2136ccdc3ad33002c40974fe864f3d932afd1829d3c3c43 WatchSource:0}: Error finding container 8cb5ba6a2034e9aea2136ccdc3ad33002c40974fe864f3d932afd1829d3c3c43: Status 404 returned error can't find the container with id 8cb5ba6a2034e9aea2136ccdc3ad33002c40974fe864f3d932afd1829d3c3c43
Apr 16 22:13:36.113114 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.112186 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9614d8df_9bb5_4a22_a608_e18aa7fb1162.slice/crio-de96a7bddaf305ff50941abab4076445468e5afc85ca3250297b13fb3f42a0ba WatchSource:0}: Error finding container de96a7bddaf305ff50941abab4076445468e5afc85ca3250297b13fb3f42a0ba: Status 404 returned error can't find the container with id de96a7bddaf305ff50941abab4076445468e5afc85ca3250297b13fb3f42a0ba
Apr 16 22:13:36.113268 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:13:36.113160 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29ce4801_ff31_4651_98b4_aba09699b7b6.slice/crio-ebb2928c257df51e145427ae0d893777178f35d99337ecbc15ff9e82d8080de5 WatchSource:0}: Error finding container ebb2928c257df51e145427ae0d893777178f35d99337ecbc15ff9e82d8080de5: Status 404 returned error can't find the container with id ebb2928c257df51e145427ae0d893777178f35d99337ecbc15ff9e82d8080de5
Apr 16 22:13:36.346910 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.346716 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:34 +0000 UTC" deadline="2027-12-11 10:39:35.299781347 +0000 UTC"
Apr 16 22:13:36.346910 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.346902 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14484h25m58.952883261s"
Apr 16 22:13:36.443977 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.443884 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:36.444125 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.444030 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:36.455249 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.455200 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" event={"ID":"09a45ec1566c454073ee33f001f99f61","Type":"ContainerStarted","Data":"de9de1d799dd04004189284680aa3c0c32f100e6aaf75a0899a0a65c14bf2f28"}
Apr 16 22:13:36.458376 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.458317 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerStarted","Data":"ebb2928c257df51e145427ae0d893777178f35d99337ecbc15ff9e82d8080de5"}
Apr 16 22:13:36.459849 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.459819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2grf8" event={"ID":"9614d8df-9bb5-4a22-a608-e18aa7fb1162","Type":"ContainerStarted","Data":"de96a7bddaf305ff50941abab4076445468e5afc85ca3250297b13fb3f42a0ba"}
Apr 16 22:13:36.462381 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.462356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"8cb5ba6a2034e9aea2136ccdc3ad33002c40974fe864f3d932afd1829d3c3c43"}
Apr 16 22:13:36.465788 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.465757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" event={"ID":"3fc696b3-b6dd-4382-8367-ced58e1b1bdd","Type":"ContainerStarted","Data":"67a79475c3cf3d106b762a47ad8b08e03f34e71ccfc773f65d093e0c965a21ef"}
Apr 16 22:13:36.468290 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.468235 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-191.ec2.internal" podStartSLOduration=2.468220606 podStartE2EDuration="2.468220606s" podCreationTimestamp="2026-04-16 22:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:36.468164994 +0000 UTC m=+3.710981057" watchObservedRunningTime="2026-04-16 22:13:36.468220606 +0000 UTC m=+3.711036677"
Apr 16 22:13:36.470368 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.470340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gssjh" event={"ID":"cc0a50b2-d73b-40da-a946-11e81bed8282","Type":"ContainerStarted","Data":"645780e3b90c3b47fc2f37755eb99ac40bebda1b36b61c2b567f65fa3d1f2cde"}
Apr 16 22:13:36.474974 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.474808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dm8zf" event={"ID":"d2e4bc53-aead-430d-aaf8-6def343926ef","Type":"ContainerStarted","Data":"a62a3b959a04b6e8264d49fdb09db472219c833dc756c8308482fd525076103f"}
Apr 16 22:13:36.477732 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.477682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" event={"ID":"32b9b798-09e2-4502-8f94-c5f194be68e3","Type":"ContainerStarted","Data":"afe74ce64997425c59d59f9860f07373a05a8754da16e80c617ba676254f14e6"}
Apr 16 22:13:36.478957 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.478882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jhsst" event={"ID":"0208e93a-b489-404a-8e48-d0d66d76793f","Type":"ContainerStarted","Data":"cf43ac075510422620f521966fa11605b234acc93fa74910d8e4dbf9f7faa77d"}
Apr 16 22:13:36.480182 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.480156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wjxwh" event={"ID":"6aaeb270-2bd7-4647-889b-36ff3ceba5cf","Type":"ContainerStarted","Data":"b65df7aa64799f34814f51b68df9dd45797bc73161cf9e068513481028f2ba84"}
Apr 16 22:13:36.937545 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:36.936949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:36.937545 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.937137 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:36.937545 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:36.937200 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:38.937181481 +0000 UTC m=+6.179997526 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:37.038106 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:37.037471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:37.038106 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:37.037634 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:37.038106 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:37.037658 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:37.038106 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:37.037670 2574 projected.go:194] Error preparing data for projected volume kube-api-access-mdkht for pod openshift-network-diagnostics/network-check-target-4g7hv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:37.038106 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:37.037730 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht podName:06497fd8-f35d-4fd4-b42b-13ff6ded57e8 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:39.037711912 +0000 UTC m=+6.280527966 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdkht" (UniqueName: "kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht") pod "network-check-target-4g7hv" (UID: "06497fd8-f35d-4fd4-b42b-13ff6ded57e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:37.446943 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:37.446844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:37.447359 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:37.446997 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:37.491975 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:37.491919 2574 generic.go:358] "Generic (PLEG): container finished" podID="7e9ac9a5ecee46125a37b1b7d1e8dc22" containerID="91dda9b87b721aa753f5971cd18a9e65a6b16391b5adbf6725d7fa4452de0e16" exitCode=0
Apr 16 22:13:37.492724 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:37.492490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" event={"ID":"7e9ac9a5ecee46125a37b1b7d1e8dc22","Type":"ContainerDied","Data":"91dda9b87b721aa753f5971cd18a9e65a6b16391b5adbf6725d7fa4452de0e16"}
Apr 16 22:13:38.445142 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:38.444471 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:38.445142 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:38.444617 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:38.508310 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:38.508274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" event={"ID":"7e9ac9a5ecee46125a37b1b7d1e8dc22","Type":"ContainerStarted","Data":"e520fc6aa32b61f9c0f98e632ac98f75c99af5698d13052c41967583388de643"}
Apr 16 22:13:38.953990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:38.953948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:38.954194 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:38.954126 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:38.954194 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:38.954192 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:42.954173359 +0000 UTC m=+10.196989422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:39.055159 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:39.055119 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:39.055338 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:39.055256 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:39.055338 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:39.055274 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:39.055338 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:39.055284 2574 projected.go:194] Error preparing data for projected volume kube-api-access-mdkht for pod openshift-network-diagnostics/network-check-target-4g7hv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:39.055494 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:39.055342 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht podName:06497fd8-f35d-4fd4-b42b-13ff6ded57e8 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:43.05532364 +0000 UTC m=+10.298139698 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdkht" (UniqueName: "kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht") pod "network-check-target-4g7hv" (UID: "06497fd8-f35d-4fd4-b42b-13ff6ded57e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:39.447267 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:39.446778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:39.447267 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:39.446912 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:40.444428 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:40.443902 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:40.444428 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:40.444056 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:41.445053 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:41.445003 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:41.445518 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:41.445176 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:42.444768 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:42.444726 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:42.445017 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:42.444879 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:42.986682 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:42.986640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:42.987188 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:42.986823 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:42.987188 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:42.986905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:50.98688224 +0000 UTC m=+18.229698292 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:43.088053 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:43.088008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:43.088237 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:43.088164 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:43.088237 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:43.088185 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:43.088237 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:43.088200 2574 projected.go:194] Error preparing data for projected volume kube-api-access-mdkht for pod openshift-network-diagnostics/network-check-target-4g7hv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:43.088397 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:43.088278 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht podName:06497fd8-f35d-4fd4-b42b-13ff6ded57e8 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:51.088257947 +0000 UTC m=+18.331073999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdkht" (UniqueName: "kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht") pod "network-check-target-4g7hv" (UID: "06497fd8-f35d-4fd4-b42b-13ff6ded57e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:43.445001 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:43.444739 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:43.445001 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:43.444914 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:44.443901 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:44.443861 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:44.444429 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:44.444016 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:45.444644 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:45.444612 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:45.445132 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:45.444780 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:46.444635 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:46.444595 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:46.444812 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:46.444734 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:47.245546 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.245484 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-191.ec2.internal" podStartSLOduration=13.245464794 podStartE2EDuration="13.245464794s" podCreationTimestamp="2026-04-16 22:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:38.523037535 +0000 UTC m=+5.765853607" watchObservedRunningTime="2026-04-16 22:13:47.245464794 +0000 UTC m=+14.488280911"
Apr 16 22:13:47.245707 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.245606 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dmn2w"]
Apr 16 22:13:47.247516 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.247497 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.247620 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:47.247564 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d"
Apr 16 22:13:47.322683 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.322652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c08f4349-6022-4892-a46d-87843f55329d-dbus\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.322840 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.322736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c08f4349-6022-4892-a46d-87843f55329d-kubelet-config\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.322840 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.322779 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.423407 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.423334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c08f4349-6022-4892-a46d-87843f55329d-kubelet-config\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.423407 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.423383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.423600 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.423415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c08f4349-6022-4892-a46d-87843f55329d-dbus\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.423600 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.423452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c08f4349-6022-4892-a46d-87843f55329d-kubelet-config\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.423600 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.423538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c08f4349-6022-4892-a46d-87843f55329d-dbus\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.423600 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:47.423568 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:47.423798 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:47.423635 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret podName:c08f4349-6022-4892-a46d-87843f55329d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:47.923615491 +0000 UTC m=+15.166431533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret") pod "global-pull-secret-syncer-dmn2w" (UID: "c08f4349-6022-4892-a46d-87843f55329d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:47.444056 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.444031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:47.444215 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:47.444169 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:47.925656 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:47.925622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:47.926070 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:47.925771 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:47.926070 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:47.925838 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret podName:c08f4349-6022-4892-a46d-87843f55329d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.92581797 +0000 UTC m=+16.168634010 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret") pod "global-pull-secret-syncer-dmn2w" (UID: "c08f4349-6022-4892-a46d-87843f55329d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:48.444829 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:48.444797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:48.445002 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:48.444919 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:48.932099 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:48.932060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:48.932524 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:48.932171 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:48.932524 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:48.932241 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret podName:c08f4349-6022-4892-a46d-87843f55329d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:50.932222502 +0000 UTC m=+18.175038551 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret") pod "global-pull-secret-syncer-dmn2w" (UID: "c08f4349-6022-4892-a46d-87843f55329d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:49.444728 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:49.444694 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:49.444904 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:49.444749 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:49.444904 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:49.444842 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:49.445051 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:49.444977 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d"
Apr 16 22:13:50.444509 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:50.444473 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:13:50.444958 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:50.444614 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:50.948643 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:50.948601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:50.948824 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:50.948736 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:50.948824 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:50.948803 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret podName:c08f4349-6022-4892-a46d-87843f55329d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:54.948787802 +0000 UTC m=+22.191603841 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret") pod "global-pull-secret-syncer-dmn2w" (UID: "c08f4349-6022-4892-a46d-87843f55329d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:51.049713 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:51.049672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:51.049886 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.049853 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:51.049977 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.049927 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.049907013 +0000 UTC m=+34.292723065 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:51.151168 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:51.151128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:51.151324 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.151270 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:51.151324 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.151294 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:51.151324 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.151305 2574 projected.go:194] Error preparing data for projected volume kube-api-access-mdkht for pod openshift-network-diagnostics/network-check-target-4g7hv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:51.151469 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.151373 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht podName:06497fd8-f35d-4fd4-b42b-13ff6ded57e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:07.151355762 +0000 UTC m=+34.394171802 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdkht" (UniqueName: "kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht") pod "network-check-target-4g7hv" (UID: "06497fd8-f35d-4fd4-b42b-13ff6ded57e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:51.443913 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:51.443879 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:51.444103 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:51.443940 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:13:51.444103 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.444047 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22" Apr 16 22:13:51.444220 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:51.444182 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d" Apr 16 22:13:52.444199 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:52.444163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:52.444598 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:52.444316 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8" Apr 16 22:13:53.445692 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.445423 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:53.446267 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.445484 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:13:53.446267 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:53.445766 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22" Apr 16 22:13:53.446267 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:53.445827 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d" Apr 16 22:13:53.535002 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.534973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dm8zf" event={"ID":"d2e4bc53-aead-430d-aaf8-6def343926ef","Type":"ContainerStarted","Data":"55ca37c487e2892d828a4254611e1202d4df1e7ac42f5c17b9c91d3cc0f117f0"} Apr 16 22:13:53.537273 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.537240 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" event={"ID":"32b9b798-09e2-4502-8f94-c5f194be68e3","Type":"ContainerStarted","Data":"e7c5b575aaedf79555c11b6c7a0c5c79307f3f68c1bbc0d27b4629569a5d2373"} Apr 16 22:13:53.540098 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.540073 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jhsst" event={"ID":"0208e93a-b489-404a-8e48-d0d66d76793f","Type":"ContainerStarted","Data":"9f12179341cfb49037206a4707c3bddedfeb428eca81d2c2adb61e81f1e2425d"} Apr 16 22:13:53.541423 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.541402 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wjxwh" event={"ID":"6aaeb270-2bd7-4647-889b-36ff3ceba5cf","Type":"ContainerStarted","Data":"049cb14581d776d518c8ff6b7e2f777692ae1b54573217cdf8a0224a3bb580ce"} Apr 16 22:13:53.542729 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.542703 2574 
generic.go:358] "Generic (PLEG): container finished" podID="29ce4801-ff31-4651-98b4-aba09699b7b6" containerID="0f465f9e488f3d6f8a1208b1ba675e8991dd72558503fcfbc7a9709651f46129" exitCode=0 Apr 16 22:13:53.542823 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.542769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerDied","Data":"0f465f9e488f3d6f8a1208b1ba675e8991dd72558503fcfbc7a9709651f46129"} Apr 16 22:13:53.544058 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.544032 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2grf8" event={"ID":"9614d8df-9bb5-4a22-a608-e18aa7fb1162","Type":"ContainerStarted","Data":"e28f1d5135f13859ab0dec6fa235c05adc2c43b5fdc47e14795c900236f494dd"} Apr 16 22:13:53.546055 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.546033 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:13:53.546359 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.546333 2574 generic.go:358] "Generic (PLEG): container finished" podID="6970b263-be92-460e-92da-a049f7bdbafe" containerID="e857f4541b8fe6005efdf6b61b4a6bc5e87f7fc43d2dfe5cd70a793e0affc6fb" exitCode=1 Apr 16 22:13:53.546441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.546392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"5ed5b519f58c464bf702b0d09b641205b67e06ed4baa71eec29ca6c7cf515aba"} Apr 16 22:13:53.546441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.546418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" 
event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"d9bd5136c27a5e3fbc4941f83c106a6b753eaa0536a61a7e42e37b7ee9a9abe3"} Apr 16 22:13:53.546441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.546433 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerDied","Data":"e857f4541b8fe6005efdf6b61b4a6bc5e87f7fc43d2dfe5cd70a793e0affc6fb"} Apr 16 22:13:53.546566 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.546449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"13ae4f90f09e31fa35d51c72d5c91553aa172b8b0fb29aa810a60e10824cbabe"} Apr 16 22:13:53.547887 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.547849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" event={"ID":"3fc696b3-b6dd-4382-8367-ced58e1b1bdd","Type":"ContainerStarted","Data":"891859055ba90a255eb17dfa7acbe930a8e863b2b323efb8348d8a16e497525a"} Apr 16 22:13:53.548548 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.548505 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dm8zf" podStartSLOduration=3.868798886 podStartE2EDuration="20.548491498s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.110884813 +0000 UTC m=+3.353700853" lastFinishedPulling="2026-04-16 22:13:52.790577416 +0000 UTC m=+20.033393465" observedRunningTime="2026-04-16 22:13:53.548093437 +0000 UTC m=+20.790909499" watchObservedRunningTime="2026-04-16 22:13:53.548491498 +0000 UTC m=+20.791307560" Apr 16 22:13:53.574541 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.574490 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-2grf8" podStartSLOduration=3.904270291 podStartE2EDuration="20.574477488s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.114484717 +0000 UTC m=+3.357300758" lastFinishedPulling="2026-04-16 22:13:52.784691904 +0000 UTC m=+20.027507955" observedRunningTime="2026-04-16 22:13:53.573958983 +0000 UTC m=+20.816775046" watchObservedRunningTime="2026-04-16 22:13:53.574477488 +0000 UTC m=+20.817293552" Apr 16 22:13:53.574892 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.574852 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jhsst" podStartSLOduration=3.9634746720000003 podStartE2EDuration="20.574840948s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.108709883 +0000 UTC m=+3.351525923" lastFinishedPulling="2026-04-16 22:13:52.720076156 +0000 UTC m=+19.962892199" observedRunningTime="2026-04-16 22:13:53.560491551 +0000 UTC m=+20.803307613" watchObservedRunningTime="2026-04-16 22:13:53.574840948 +0000 UTC m=+20.817657009" Apr 16 22:13:53.590713 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.590679 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wjxwh" podStartSLOduration=3.887850999 podStartE2EDuration="20.590665331s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.101248615 +0000 UTC m=+3.344064654" lastFinishedPulling="2026-04-16 22:13:52.804062938 +0000 UTC m=+20.046878986" observedRunningTime="2026-04-16 22:13:53.590151373 +0000 UTC m=+20.832967434" watchObservedRunningTime="2026-04-16 22:13:53.590665331 +0000 UTC m=+20.833481394" Apr 16 22:13:53.625361 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:53.625312 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lhbpr" 
podStartSLOduration=3.9486509979999997 podStartE2EDuration="20.625295263s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.109621449 +0000 UTC m=+3.352437489" lastFinishedPulling="2026-04-16 22:13:52.786265703 +0000 UTC m=+20.029081754" observedRunningTime="2026-04-16 22:13:53.624469943 +0000 UTC m=+20.867286005" watchObservedRunningTime="2026-04-16 22:13:53.625295263 +0000 UTC m=+20.868111323" Apr 16 22:13:54.444717 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.444688 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:54.444852 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:54.444818 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8" Apr 16 22:13:54.457148 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.457108 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:13:54.552758 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.552729 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:13:54.553200 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.553171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"d017126d9331909b982c474dfa56ab9acc4997772f705d8f0fc5352e0d9a586d"} Apr 16 22:13:54.553312 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.553210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"62961b757252c3845562fbb05a57f87a7310c2c8a2ff88453788aa7f1e38e6ae"} Apr 16 22:13:54.554908 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.554880 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" event={"ID":"3fc696b3-b6dd-4382-8367-ced58e1b1bdd","Type":"ContainerStarted","Data":"7394329ece4bd211c712107a61c2787c961d0a5a42b19aa69a987f1d16df2a28"} Apr 16 22:13:54.556305 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.556272 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gssjh" event={"ID":"cc0a50b2-d73b-40da-a946-11e81bed8282","Type":"ContainerStarted","Data":"8fac7914688d717c827009976c9b36ab913a90d6143ea2df7b18075b83557422"} Apr 16 
22:13:54.571938 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.571888 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gssjh" podStartSLOduration=4.89947697 podStartE2EDuration="21.571871868s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.112286744 +0000 UTC m=+3.355102794" lastFinishedPulling="2026-04-16 22:13:52.784681636 +0000 UTC m=+20.027497692" observedRunningTime="2026-04-16 22:13:54.57169442 +0000 UTC m=+21.814510475" watchObservedRunningTime="2026-04-16 22:13:54.571871868 +0000 UTC m=+21.814687932" Apr 16 22:13:54.982855 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:54.982813 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:13:54.983065 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:54.982996 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:54.983128 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:54.983108 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret podName:c08f4349-6022-4892-a46d-87843f55329d nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.98308397 +0000 UTC m=+30.225900023 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret") pod "global-pull-secret-syncer-dmn2w" (UID: "c08f4349-6022-4892-a46d-87843f55329d") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:55.384362 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:55.384230 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:13:54.457129144Z","UUID":"d753c228-7c9f-41fd-aaab-c862ad431bd4","Handler":null,"Name":"","Endpoint":""} Apr 16 22:13:55.387795 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:55.387769 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:13:55.387795 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:55.387799 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:13:55.444039 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:55.444008 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:55.444201 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:55.444144 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22" Apr 16 22:13:55.444201 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:55.444156 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:13:55.444313 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:55.444278 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d" Apr 16 22:13:56.444541 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:56.444361 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:56.445028 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:56.444634 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8" Apr 16 22:13:56.563268 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:56.563239 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:13:56.563629 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:56.563589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"d052a28de1d428cfca369440cc13cb874ac2dc866a53bdd7f32bfdeba36ca155"} Apr 16 22:13:56.565463 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:56.565434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" event={"ID":"3fc696b3-b6dd-4382-8367-ced58e1b1bdd","Type":"ContainerStarted","Data":"6929c2431b0d961763ae35834f179dd2421fd5348094cab3c2dd86da75bf7944"} Apr 16 22:13:56.584570 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:56.584492 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5pbv6" podStartSLOduration=4.15575863 podStartE2EDuration="23.584474996s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.104599721 +0000 UTC m=+3.347415775" lastFinishedPulling="2026-04-16 22:13:55.5333161 +0000 UTC m=+22.776132141" observedRunningTime="2026-04-16 22:13:56.583033272 +0000 UTC m=+23.825849324" watchObservedRunningTime="2026-04-16 22:13:56.584474996 +0000 UTC m=+23.827291069" Apr 16 22:13:57.191121 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:57.191086 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:57.191764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:57.191723 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:57.444175 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:57.444103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:13:57.444336 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:57.444116 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:13:57.444336 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:57.444209 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22" Apr 16 22:13:57.444336 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:57.444298 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d" Apr 16 22:13:57.567565 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:57.567502 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:57.568166 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:57.567871 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jhsst" Apr 16 22:13:58.444910 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.444737 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:13:58.445085 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:58.445003 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:13:58.570888 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.570860 2574 generic.go:358] "Generic (PLEG): container finished" podID="29ce4801-ff31-4651-98b4-aba09699b7b6" containerID="03440ce867f56eb7bec808a3b52796a08f924f8fcbe36138564fac9549248cac" exitCode=0
Apr 16 22:13:58.571446 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.570952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerDied","Data":"03440ce867f56eb7bec808a3b52796a08f924f8fcbe36138564fac9549248cac"}
Apr 16 22:13:58.573987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.573971 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:13:58.574331 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.574298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"742b74ab928e59a7c358660157343e871d935214ae0fd186972028e7993a6d43"}
Apr 16 22:13:58.574737 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.574719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:58.574809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.574744 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:58.574859 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.574842 2574 scope.go:117] "RemoveContainer" containerID="e857f4541b8fe6005efdf6b61b4a6bc5e87f7fc43d2dfe5cd70a793e0affc6fb"
Apr 16 22:13:58.589913 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.589896 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:58.593362 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:58.593345 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:13:59.444542 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:59.444512 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:13:59.444698 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:59.444626 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:13:59.444698 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:59.444684 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:13:59.444829 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:13:59.444805 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d"
Apr 16 22:13:59.579197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:59.579171 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:13:59.579675 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:59.579521 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" event={"ID":"6970b263-be92-460e-92da-a049f7bdbafe","Type":"ContainerStarted","Data":"f3044460893b9587349df81c2eba35a92b11dd2e209bd55d11a155013adf91b2"}
Apr 16 22:13:59.579728 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:59.579692 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:13:59.610157 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:13:59.610107 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" podStartSLOduration=9.677869324 podStartE2EDuration="26.610088897s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.11357963 +0000 UTC m=+3.356395668" lastFinishedPulling="2026-04-16 22:13:53.045799199 +0000 UTC m=+20.288615241" observedRunningTime="2026-04-16 22:13:59.6095651 +0000 UTC m=+26.852381161" watchObservedRunningTime="2026-04-16 22:13:59.610088897 +0000 UTC m=+26.852904985"
Apr 16 22:14:00.163814 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.163786 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4g7hv"]
Apr 16 22:14:00.163979 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.163899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:00.164043 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:00.164023 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:14:00.167578 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.167549 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dmn2w"]
Apr 16 22:14:00.167682 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.167622 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:14:00.167721 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:00.167692 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d"
Apr 16 22:14:00.182580 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.182551 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-knqmk"]
Apr 16 22:14:00.182701 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.182650 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:14:00.182822 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:00.182758 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:14:00.583028 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.582992 2574 generic.go:358] "Generic (PLEG): container finished" podID="29ce4801-ff31-4651-98b4-aba09699b7b6" containerID="82f77375a162728a4803132d803cfb0b339ba1093a78a5bb4dc153f18562236e" exitCode=0
Apr 16 22:14:00.583429 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.583084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerDied","Data":"82f77375a162728a4803132d803cfb0b339ba1093a78a5bb4dc153f18562236e"}
Apr 16 22:14:00.583429 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:00.583283 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:14:02.444321 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:02.444159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:14:02.444777 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:02.444163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:02.444777 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:02.444413 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d"
Apr 16 22:14:02.444777 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:02.444163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:14:02.444777 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:02.444477 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:14:02.444777 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:02.444553 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:14:02.589292 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:02.589258 2574 generic.go:358] "Generic (PLEG): container finished" podID="29ce4801-ff31-4651-98b4-aba09699b7b6" containerID="87bc7c896619c1d3bcc195f85ab8c52b7d9baeaf107af474cc08e520805010ca" exitCode=0
Apr 16 22:14:02.589426 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:02.589334 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerDied","Data":"87bc7c896619c1d3bcc195f85ab8c52b7d9baeaf107af474cc08e520805010ca"}
Apr 16 22:14:03.046839 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:03.046805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:14:03.047029 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:03.046981 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:14:03.047109 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:03.047059 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret podName:c08f4349-6022-4892-a46d-87843f55329d nodeName:}" failed. No retries permitted until 2026-04-16 22:14:19.047038488 +0000 UTC m=+46.289854541 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret") pod "global-pull-secret-syncer-dmn2w" (UID: "c08f4349-6022-4892-a46d-87843f55329d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:14:03.540990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:03.540952 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:14:03.541563 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:03.541243 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:14:03.558537 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:03.558480 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" podUID="6970b263-be92-460e-92da-a049f7bdbafe" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 22:14:03.568779 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:03.568741 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7" podUID="6970b263-be92-460e-92da-a049f7bdbafe" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 22:14:04.444511 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:04.444470 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:04.444694 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:04.444518 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:14:04.444694 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:04.444597 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4g7hv" podUID="06497fd8-f35d-4fd4-b42b-13ff6ded57e8"
Apr 16 22:14:04.444694 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:04.444628 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmn2w" podUID="c08f4349-6022-4892-a46d-87843f55329d"
Apr 16 22:14:04.444694 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:04.444664 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:14:04.444918 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:04.444780 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:14:06.093320 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.093277 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-191.ec2.internal" event="NodeReady"
Apr 16 22:14:06.093789 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.093449 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:06.139950 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.139901 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8qb8j"]
Apr 16 22:14:06.166002 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.165901 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k6prk"]
Apr 16 22:14:06.166186 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.166080 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.168788 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.168595 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:06.168788 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.168655 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:06.168788 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.168671 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-b4hjg\""
Apr 16 22:14:06.168788 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.168659 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:06.180330 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.180304 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8qb8j"]
Apr 16 22:14:06.180330 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.180338 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k6prk"]
Apr 16 22:14:06.180511 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.180469 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.183271 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.183101 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:06.183423 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.183362 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:06.183534 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.183362 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-66k94\""
Apr 16 22:14:06.270670 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.270620 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.270670 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.270675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.270920 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.270762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khskn\" (UniqueName: \"kubernetes.io/projected/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-kube-api-access-khskn\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.270920 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.270821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-tmp-dir\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.270920 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.270865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhf7j\" (UniqueName: \"kubernetes.io/projected/92752f12-a0e9-4d82-90c5-3b5beed10ab8-kube-api-access-zhf7j\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.270920 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.270892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-config-volume\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.371707 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.371675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-tmp-dir\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.371978 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.371744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhf7j\" (UniqueName: \"kubernetes.io/projected/92752f12-a0e9-4d82-90c5-3b5beed10ab8-kube-api-access-zhf7j\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.371978 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.371768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-config-volume\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.371978 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.371836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.371978 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.371855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.372215 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.372027 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:06.372215 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.372045 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:06.372215 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.372088 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:06.87206765 +0000 UTC m=+34.114883693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:06.372215 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.372108 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:06.872098446 +0000 UTC m=+34.114914492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found
Apr 16 22:14:06.372215 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.372132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khskn\" (UniqueName: \"kubernetes.io/projected/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-kube-api-access-khskn\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.372451 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.372431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-config-volume\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.372500 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.372445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-tmp-dir\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.382262 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.382237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khskn\" (UniqueName: \"kubernetes.io/projected/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-kube-api-access-khskn\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.382384 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.382300 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhf7j\" (UniqueName: \"kubernetes.io/projected/92752f12-a0e9-4d82-90c5-3b5beed10ab8-kube-api-access-zhf7j\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.444010 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.443969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:14:06.444010 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.443990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:06.444373 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.444346 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w"
Apr 16 22:14:06.446537 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.446513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:06.446659 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.446550 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vqjpn\""
Apr 16 22:14:06.446802 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.446771 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zhbqt\""
Apr 16 22:14:06.446914 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.446839 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:14:06.446914 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.446889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:06.447125 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.447107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:06.876201 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.876166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:06.876422 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:06.876213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:06.876422 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.876294 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:06.876422 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.876356 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.876342211 +0000 UTC m=+35.119158254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found
Apr 16 22:14:06.876422 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.876295 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:06.876650 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:06.876475 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.876447685 +0000 UTC m=+35.119263734 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:07.078148 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:07.078110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:14:07.078344 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:07.078254 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:07.078344 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:07.078318 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:39.078304094 +0000 UTC m=+66.321120138 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : secret "metrics-daemon-secret" not found
Apr 16 22:14:07.178535 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:07.178459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:07.181381 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:07.181360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkht\" (UniqueName: \"kubernetes.io/projected/06497fd8-f35d-4fd4-b42b-13ff6ded57e8-kube-api-access-mdkht\") pod \"network-check-target-4g7hv\" (UID: \"06497fd8-f35d-4fd4-b42b-13ff6ded57e8\") " pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:07.364650 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:07.364611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:14:07.885358 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:07.885310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:07.885655 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:07.885366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:07.885655 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:07.885483 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:07.885655 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:07.885541 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:07.885655 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:07.885559 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:09.885539261 +0000 UTC m=+37.128355301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found
Apr 16 22:14:07.885655 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:07.885605 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:09.885587791 +0000 UTC m=+37.128403844 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:08.375515 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:08.375338 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4g7hv"]
Apr 16 22:14:08.463836 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:14:08.463793 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06497fd8_f35d_4fd4_b42b_13ff6ded57e8.slice/crio-be01f604dc3b643757432dd4cc43bf4d92b7bd36dd028a59a905ff2b0a13782f WatchSource:0}: Error finding container be01f604dc3b643757432dd4cc43bf4d92b7bd36dd028a59a905ff2b0a13782f: Status 404 returned error can't find the container with id be01f604dc3b643757432dd4cc43bf4d92b7bd36dd028a59a905ff2b0a13782f
Apr 16 22:14:08.601251 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:08.601213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4g7hv" event={"ID":"06497fd8-f35d-4fd4-b42b-13ff6ded57e8","Type":"ContainerStarted","Data":"be01f604dc3b643757432dd4cc43bf4d92b7bd36dd028a59a905ff2b0a13782f"}
Apr 16 22:14:09.605825 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:09.605785 2574 generic.go:358] "Generic (PLEG): container finished" podID="29ce4801-ff31-4651-98b4-aba09699b7b6" containerID="4c092df2c65f0fe5f6929c6c97a3822a7c701a6e3a9ba2e552adf6ac754d924a" exitCode=0
Apr 16 22:14:09.606216 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:09.605854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerDied","Data":"4c092df2c65f0fe5f6929c6c97a3822a7c701a6e3a9ba2e552adf6ac754d924a"}
Apr 16 22:14:09.904408 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:09.904372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:09.904408 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:09.904413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:09.904642 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:09.904551 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:09.904642 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:09.904605 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.904591873 +0000 UTC m=+41.147407915 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:09.904642 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:09.904550 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:09.904806 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:09.904705 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.904684423 +0000 UTC m=+41.147500479 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found Apr 16 22:14:10.611292 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:10.611259 2574 generic.go:358] "Generic (PLEG): container finished" podID="29ce4801-ff31-4651-98b4-aba09699b7b6" containerID="11cd273cdcbeeff227f70da1a8ce2cf245f9eafc16680654fb9c82c191f54d2f" exitCode=0 Apr 16 22:14:10.611718 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:10.611336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerDied","Data":"11cd273cdcbeeff227f70da1a8ce2cf245f9eafc16680654fb9c82c191f54d2f"} Apr 16 22:14:11.615914 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:11.615700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" event={"ID":"29ce4801-ff31-4651-98b4-aba09699b7b6","Type":"ContainerStarted","Data":"a0e7e51694bedfb5cfa8408e076341eace16e848eb758e4eb1e2f679b8b5dc52"} Apr 16 22:14:11.616959 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:11.616921 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4g7hv" event={"ID":"06497fd8-f35d-4fd4-b42b-13ff6ded57e8","Type":"ContainerStarted","Data":"3d7e632a006c501000ee58ee8024020271abe11cee297bb186cf360c5b77dc42"} Apr 16 22:14:11.617078 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:11.617062 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4g7hv" Apr 16 22:14:11.638331 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:11.638286 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5pd4l" 
podStartSLOduration=6.253888314 podStartE2EDuration="38.638272215s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:13:36.11548035 +0000 UTC m=+3.358296397" lastFinishedPulling="2026-04-16 22:14:08.499864256 +0000 UTC m=+35.742680298" observedRunningTime="2026-04-16 22:14:11.636845337 +0000 UTC m=+38.879661396" watchObservedRunningTime="2026-04-16 22:14:11.638272215 +0000 UTC m=+38.881088275" Apr 16 22:14:11.651267 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:11.651212 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4g7hv" podStartSLOduration=35.910175325 podStartE2EDuration="38.651196297s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:14:08.47875461 +0000 UTC m=+35.721570649" lastFinishedPulling="2026-04-16 22:14:11.219775573 +0000 UTC m=+38.462591621" observedRunningTime="2026-04-16 22:14:11.650749191 +0000 UTC m=+38.893565251" watchObservedRunningTime="2026-04-16 22:14:11.651196297 +0000 UTC m=+38.894012358" Apr 16 22:14:13.934003 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:13.933965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j" Apr 16 22:14:13.934404 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:13.934010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk" Apr 16 22:14:13.934404 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:13.934122 2574 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:13.934404 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:13.934183 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:21.934168624 +0000 UTC m=+49.176984663 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found Apr 16 22:14:13.934404 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:13.934122 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:13.934404 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:13.934268 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:21.934256018 +0000 UTC m=+49.177072076 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found Apr 16 22:14:19.068442 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:19.068396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:14:19.073816 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:19.073792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c08f4349-6022-4892-a46d-87843f55329d-original-pull-secret\") pod \"global-pull-secret-syncer-dmn2w\" (UID: \"c08f4349-6022-4892-a46d-87843f55329d\") " pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:14:19.370780 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:19.370744 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmn2w" Apr 16 22:14:19.502572 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:19.502550 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dmn2w"] Apr 16 22:14:19.504690 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:14:19.504662 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc08f4349_6022_4892_a46d_87843f55329d.slice/crio-3f86a9bce4ef84b3d570c2760d0729afbb0695f9198e7c77848fd2b55590da50 WatchSource:0}: Error finding container 3f86a9bce4ef84b3d570c2760d0729afbb0695f9198e7c77848fd2b55590da50: Status 404 returned error can't find the container with id 3f86a9bce4ef84b3d570c2760d0729afbb0695f9198e7c77848fd2b55590da50 Apr 16 22:14:19.632044 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:19.631969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dmn2w" event={"ID":"c08f4349-6022-4892-a46d-87843f55329d","Type":"ContainerStarted","Data":"3f86a9bce4ef84b3d570c2760d0729afbb0695f9198e7c77848fd2b55590da50"} Apr 16 22:14:20.050299 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.050224 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"] Apr 16 22:14:20.053879 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.053860 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.056885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.056851 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 22:14:20.057167 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.057150 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 22:14:20.057748 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.057725 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 22:14:20.058417 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.058378 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 22:14:20.066226 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.066204 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz"] Apr 16 22:14:20.069772 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.069751 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"] Apr 16 22:14:20.070122 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.069869 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.074538 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.074412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 22:14:20.074538 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.074485 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 22:14:20.074701 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.074542 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 22:14:20.075174 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.075153 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 22:14:20.081353 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.081332 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz"] Apr 16 22:14:20.176432 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cdea09b8-38c2-462e-949e-6de287fee8bf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.176432 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.176666 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-ca\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.176666 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcb7w\" (UniqueName: \"kubernetes.io/projected/cdea09b8-38c2-462e-949e-6de287fee8bf-kube-api-access-pcb7w\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.176778 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/961cf913-9711-4243-8973-bd9fccc22537-tmp\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.176778 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/961cf913-9711-4243-8973-bd9fccc22537-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.176778 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-hub\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.176952 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.176952 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.176816 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv8h7\" (UniqueName: \"kubernetes.io/projected/961cf913-9711-4243-8973-bd9fccc22537-kube-api-access-wv8h7\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.278025 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.277991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-hub\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: 
\"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.278196 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.278196 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wv8h7\" (UniqueName: \"kubernetes.io/projected/961cf913-9711-4243-8973-bd9fccc22537-kube-api-access-wv8h7\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.278196 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cdea09b8-38c2-462e-949e-6de287fee8bf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.278348 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.278348 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-ca\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.278459 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcb7w\" (UniqueName: \"kubernetes.io/projected/cdea09b8-38c2-462e-949e-6de287fee8bf-kube-api-access-pcb7w\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.278459 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/961cf913-9711-4243-8973-bd9fccc22537-tmp\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.278565 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.278462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/961cf913-9711-4243-8973-bd9fccc22537-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.279446 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.279339 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/961cf913-9711-4243-8973-bd9fccc22537-tmp\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.279577 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.279511 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cdea09b8-38c2-462e-949e-6de287fee8bf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.281202 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.281178 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.281202 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.281191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-hub\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.281615 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.281598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-ca\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: 
\"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.281690 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.281671 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cdea09b8-38c2-462e-949e-6de287fee8bf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.281768 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.281750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/961cf913-9711-4243-8973-bd9fccc22537-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.286482 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.286460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcb7w\" (UniqueName: \"kubernetes.io/projected/cdea09b8-38c2-462e-949e-6de287fee8bf-kube-api-access-pcb7w\") pod \"cluster-proxy-proxy-agent-68f888494b-rxfbz\" (UID: \"cdea09b8-38c2-462e-949e-6de287fee8bf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.286734 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.286716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv8h7\" (UniqueName: \"kubernetes.io/projected/961cf913-9711-4243-8973-bd9fccc22537-kube-api-access-wv8h7\") pod \"klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx\" (UID: \"961cf913-9711-4243-8973-bd9fccc22537\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 
22:14:20.369738 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.369709 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" Apr 16 22:14:20.390044 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.390014 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:14:20.531496 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.531463 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"] Apr 16 22:14:20.535107 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:14:20.535070 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod961cf913_9711_4243_8973_bd9fccc22537.slice/crio-b9c19a09f152f3ae33cb19f14f187ffcd283dbb6d69384a853e74df3861da17e WatchSource:0}: Error finding container b9c19a09f152f3ae33cb19f14f187ffcd283dbb6d69384a853e74df3861da17e: Status 404 returned error can't find the container with id b9c19a09f152f3ae33cb19f14f187ffcd283dbb6d69384a853e74df3861da17e Apr 16 22:14:20.552377 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.552354 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz"] Apr 16 22:14:20.557073 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:14:20.557045 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdea09b8_38c2_462e_949e_6de287fee8bf.slice/crio-768a5a26bff6765b6ab42c47c5c27170fbe6b9aaaa5c29a98b92079e26b2d8ee WatchSource:0}: Error finding container 768a5a26bff6765b6ab42c47c5c27170fbe6b9aaaa5c29a98b92079e26b2d8ee: Status 404 returned error can't find the container with id 
768a5a26bff6765b6ab42c47c5c27170fbe6b9aaaa5c29a98b92079e26b2d8ee Apr 16 22:14:20.635139 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.635055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" event={"ID":"961cf913-9711-4243-8973-bd9fccc22537","Type":"ContainerStarted","Data":"b9c19a09f152f3ae33cb19f14f187ffcd283dbb6d69384a853e74df3861da17e"} Apr 16 22:14:20.636117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:20.636084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" event={"ID":"cdea09b8-38c2-462e-949e-6de287fee8bf","Type":"ContainerStarted","Data":"768a5a26bff6765b6ab42c47c5c27170fbe6b9aaaa5c29a98b92079e26b2d8ee"} Apr 16 22:14:21.992249 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:21.991465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j" Apr 16 22:14:21.992249 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:21.991516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk" Apr 16 22:14:21.992249 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:21.991685 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:21.992249 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:21.991749 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls 
podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:37.991728546 +0000 UTC m=+65.234544608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:21.992249 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:21.992168 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:21.992249 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:21.992214 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:37.992198665 +0000 UTC m=+65.235014717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found
Apr 16 22:14:27.652360 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.652322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" event={"ID":"961cf913-9711-4243-8973-bd9fccc22537","Type":"ContainerStarted","Data":"c2ede441087a2788d9925734b95a9a3ff197550d11423ccedb5ea3a67601fa51"}
Apr 16 22:14:27.652842 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.652475 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"
Apr 16 22:14:27.653592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.653563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" event={"ID":"cdea09b8-38c2-462e-949e-6de287fee8bf","Type":"ContainerStarted","Data":"bd94b9c1ca65c639ea79bbe273aa5dd3b9d743406b2c95f44861cf43ce93136b"}
Apr 16 22:14:27.654358 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.654337 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"
Apr 16 22:14:27.654854 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.654834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dmn2w" event={"ID":"c08f4349-6022-4892-a46d-87843f55329d","Type":"ContainerStarted","Data":"0acc7b1ef5b484a79004a4254df0c291d0da471064a2a20bca2cc521fa818f94"}
Apr 16 22:14:27.670290 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.669562 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" podStartSLOduration=1.16851702 podStartE2EDuration="7.669541832s" podCreationTimestamp="2026-04-16 22:14:20 +0000 UTC" firstStartedPulling="2026-04-16 22:14:20.537372251 +0000 UTC m=+47.780188290" lastFinishedPulling="2026-04-16 22:14:27.038397058 +0000 UTC m=+54.281213102" observedRunningTime="2026-04-16 22:14:27.667015691 +0000 UTC m=+54.909831753" watchObservedRunningTime="2026-04-16 22:14:27.669541832 +0000 UTC m=+54.912357897"
Apr 16 22:14:27.683675 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:27.683635 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dmn2w" podStartSLOduration=33.194250022 podStartE2EDuration="40.68362581s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:14:19.506379961 +0000 UTC m=+46.749196000" lastFinishedPulling="2026-04-16 22:14:26.995755749 +0000 UTC m=+54.238571788" observedRunningTime="2026-04-16 22:14:27.682683478 +0000 UTC m=+54.925499536" watchObservedRunningTime="2026-04-16 22:14:27.68362581 +0000 UTC m=+54.926441871"
Apr 16 22:14:31.665122 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:31.665088 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" event={"ID":"cdea09b8-38c2-462e-949e-6de287fee8bf","Type":"ContainerStarted","Data":"f1df5c95ee87c1f58b2f89a6efc8b0949049f809f30a5adcfdc3a2ff97b3d4c5"}
Apr 16 22:14:31.665122 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:31.665124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" event={"ID":"cdea09b8-38c2-462e-949e-6de287fee8bf","Type":"ContainerStarted","Data":"6b3f24915d209c68870fa172a5eccb5a457aaf992bb063363cdb0da1df8cd7db"}
Apr 16 22:14:31.681621 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:31.681579 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" podStartSLOduration=1.599921664 podStartE2EDuration="11.681565409s" podCreationTimestamp="2026-04-16 22:14:20 +0000 UTC" firstStartedPulling="2026-04-16 22:14:20.559068639 +0000 UTC m=+47.801884684" lastFinishedPulling="2026-04-16 22:14:30.640712378 +0000 UTC m=+57.883528429" observedRunningTime="2026-04-16 22:14:31.680417563 +0000 UTC m=+58.923233626" watchObservedRunningTime="2026-04-16 22:14:31.681565409 +0000 UTC m=+58.924381469"
Apr 16 22:14:33.567923 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:33.567898 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2ds7"
Apr 16 22:14:38.005381 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:38.005337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:14:38.005381 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:38.005386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:14:38.005858 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:38.005502 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:38.005858 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:38.005505 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:38.005858 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:38.005570 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:10.00555369 +0000 UTC m=+97.248369733 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:38.005858 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:38.005584 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:10.005578203 +0000 UTC m=+97.248394241 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found
Apr 16 22:14:39.114438 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:39.114385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:14:39.114824 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:39.114527 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:39.114824 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:14:39.114597 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:43.11458067 +0000 UTC m=+130.357396712 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : secret "metrics-daemon-secret" not found
Apr 16 22:14:42.620960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:14:42.620845 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4g7hv"
Apr 16 22:15:10.038151 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:15:10.038116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:15:10.038589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:15:10.038253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:15:10.038589 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:15:10.038263 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:15:10.038589 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:15:10.038327 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:15:10.038589 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:15:10.038333 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls podName:cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:14.038319003 +0000 UTC m=+161.281135042 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls") pod "dns-default-k6prk" (UID: "cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0") : secret "dns-default-metrics-tls" not found
Apr 16 22:15:10.038589 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:15:10.038372 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert podName:92752f12-a0e9-4d82-90c5-3b5beed10ab8 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:14.038360189 +0000 UTC m=+161.281176228 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert") pod "ingress-canary-8qb8j" (UID: "92752f12-a0e9-4d82-90c5-3b5beed10ab8") : secret "canary-serving-cert" not found
Apr 16 22:15:43.173715 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:15:43.173666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:15:43.174231 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:15:43.173815 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:15:43.174231 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:15:43.173899 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs podName:cc045530-7e0f-412e-98ba-915fe7aa6d22 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:45.173879514 +0000 UTC m=+252.416695573 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs") pod "network-metrics-daemon-knqmk" (UID: "cc045530-7e0f-412e-98ba-915fe7aa6d22") : secret "metrics-daemon-secret" not found
Apr 16 22:16:06.283245 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:06.283218 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dm8zf_d2e4bc53-aead-430d-aaf8-6def343926ef/dns-node-resolver/0.log"
Apr 16 22:16:07.087846 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:07.087819 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2grf8_9614d8df-9bb5-4a22-a608-e18aa7fb1162/node-ca/0.log"
Apr 16 22:16:09.176331 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:16:09.176280 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8qb8j" podUID="92752f12-a0e9-4d82-90c5-3b5beed10ab8"
Apr 16 22:16:09.191466 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:16:09.191437 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-k6prk" podUID="cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0"
Apr 16 22:16:09.455889 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:16:09.455810 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-knqmk" podUID="cc045530-7e0f-412e-98ba-915fe7aa6d22"
Apr 16 22:16:09.895621 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:09.895592 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:16:14.082888 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.082798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:16:14.082888 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.082834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:16:14.085081 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.085062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0-metrics-tls\") pod \"dns-default-k6prk\" (UID: \"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0\") " pod="openshift-dns/dns-default-k6prk"
Apr 16 22:16:14.085227 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.085209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92752f12-a0e9-4d82-90c5-3b5beed10ab8-cert\") pod \"ingress-canary-8qb8j\" (UID: \"92752f12-a0e9-4d82-90c5-3b5beed10ab8\") " pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:16:14.098492 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.098470 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-b4hjg\""
Apr 16 22:16:14.106218 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.106205 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8qb8j"
Apr 16 22:16:14.218121 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.218084 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8qb8j"]
Apr 16 22:16:14.222400 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:14.222366 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92752f12_a0e9_4d82_90c5_3b5beed10ab8.slice/crio-423343e23629f6e747e87e4b9d259e2118beb8dca1d711183138044752e1fb5b WatchSource:0}: Error finding container 423343e23629f6e747e87e4b9d259e2118beb8dca1d711183138044752e1fb5b: Status 404 returned error can't find the container with id 423343e23629f6e747e87e4b9d259e2118beb8dca1d711183138044752e1fb5b
Apr 16 22:16:14.908664 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:14.908628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8qb8j" event={"ID":"92752f12-a0e9-4d82-90c5-3b5beed10ab8","Type":"ContainerStarted","Data":"423343e23629f6e747e87e4b9d259e2118beb8dca1d711183138044752e1fb5b"}
Apr 16 22:16:15.912704 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:15.912670 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8qb8j" event={"ID":"92752f12-a0e9-4d82-90c5-3b5beed10ab8","Type":"ContainerStarted","Data":"0b3ad88f395300400efdb38d51610b0152d25fb532053bc12345cf78f7509f49"}
Apr 16 22:16:15.930105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:15.930056 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8qb8j" podStartSLOduration=128.388921465 podStartE2EDuration="2m9.930040556s" podCreationTimestamp="2026-04-16 22:14:06 +0000 UTC" firstStartedPulling="2026-04-16 22:16:14.223780668 +0000 UTC m=+161.466596708" lastFinishedPulling="2026-04-16 22:16:15.764899755 +0000 UTC m=+163.007715799" observedRunningTime="2026-04-16 22:16:15.929165412 +0000 UTC m=+163.171981472" watchObservedRunningTime="2026-04-16 22:16:15.930040556 +0000 UTC m=+163.172856617"
Apr 16 22:16:20.444371 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:20.444340 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk"
Apr 16 22:16:22.444865 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:22.444825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k6prk"
Apr 16 22:16:22.447916 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:22.447898 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-66k94\""
Apr 16 22:16:22.456158 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:22.456140 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k6prk"
Apr 16 22:16:22.568417 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:22.568388 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k6prk"]
Apr 16 22:16:22.571233 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:22.571205 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcefc3c86_7e52_4b0c_8dfc_08cf48cb79f0.slice/crio-ac7400f4397ee80a723595cd1320bba1eb12d9ef11d77e19c2b257b556fd6d48 WatchSource:0}: Error finding container ac7400f4397ee80a723595cd1320bba1eb12d9ef11d77e19c2b257b556fd6d48: Status 404 returned error can't find the container with id ac7400f4397ee80a723595cd1320bba1eb12d9ef11d77e19c2b257b556fd6d48
Apr 16 22:16:22.928077 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:22.928046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k6prk" event={"ID":"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0","Type":"ContainerStarted","Data":"ac7400f4397ee80a723595cd1320bba1eb12d9ef11d77e19c2b257b556fd6d48"}
Apr 16 22:16:23.932175 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:23.932129 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k6prk" event={"ID":"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0","Type":"ContainerStarted","Data":"1d0764be6e108a86c3bd70483043e723ec2876d3985f8115e757a9987a85155a"}
Apr 16 22:16:24.936491 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:24.936445 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k6prk" event={"ID":"cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0","Type":"ContainerStarted","Data":"67161a3f6cce4b78b6f0280587307a84ad9572996f4007ff87d88c6f1eb59d05"}
Apr 16 22:16:24.936884 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:24.936576 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-k6prk"
Apr 16 22:16:24.957689 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:24.957644 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k6prk" podStartSLOduration=137.756147432 podStartE2EDuration="2m18.957632765s" podCreationTimestamp="2026-04-16 22:14:06 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.572900693 +0000 UTC m=+169.815716732" lastFinishedPulling="2026-04-16 22:16:23.774386017 +0000 UTC m=+171.017202065" observedRunningTime="2026-04-16 22:16:24.955949527 +0000 UTC m=+172.198765582" watchObservedRunningTime="2026-04-16 22:16:24.957632765 +0000 UTC m=+172.200448803"
Apr 16 22:16:25.742229 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.742197 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kp5wn"]
Apr 16 22:16:25.745850 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.745828 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.748512 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.748488 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 22:16:25.748832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.748728 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:16:25.749302 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.749283 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 22:16:25.749447 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.749297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hc5p5\""
Apr 16 22:16:25.750001 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.749981 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:16:25.768152 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.768128 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kp5wn"]
Apr 16 22:16:25.826733 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.826701 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7ffcd7d997-ppg7h"]
Apr 16 22:16:25.829667 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.829652 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.833812 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.833793 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:16:25.833981 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.833836 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:16:25.834401 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.834381 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:16:25.834777 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.834760 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8tzbx\""
Apr 16 22:16:25.849823 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.846795 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:16:25.861477 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.861448 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7ffcd7d997-ppg7h"]
Apr 16 22:16:25.865426 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-registry-certificates\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865531 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-trusted-ca\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865531 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-registry-tls\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865531 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-crio-socket\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.865677 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-data-volume\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.865677 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.865677 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-bound-sa-token\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865677 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lxr\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-kube-api-access-92lxr\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865677 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-ca-trust-extracted\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865918 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.865918 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-image-registry-private-configuration\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865918 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-installation-pull-secrets\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.865918 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.865864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtfb6\" (UniqueName: \"kubernetes.io/projected/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-kube-api-access-mtfb6\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.966448 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.966448 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-image-registry-private-configuration\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966474 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-installation-pull-secrets\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtfb6\" (UniqueName: \"kubernetes.io/projected/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-kube-api-access-mtfb6\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-registry-certificates\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-trusted-ca\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-registry-tls\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-crio-socket\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-data-volume\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.966990 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.966989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.967394 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-bound-sa-token\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.967394 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92lxr\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-kube-api-access-92lxr\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.967394 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-ca-trust-extracted\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.967394 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-data-volume\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.967595 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967498 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-ca-trust-extracted\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.967654 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967643 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-crio-socket\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.967811 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-registry-certificates\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.967965 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.967856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn"
Apr 16 22:16:25.968068 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.968048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-trusted-ca\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:25.969339 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.969307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kp5wn\" (UID:
\"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn" Apr 16 22:16:25.969475 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.969445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-image-registry-private-configuration\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" Apr 16 22:16:25.969555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.969520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-installation-pull-secrets\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" Apr 16 22:16:25.969913 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.969890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-registry-tls\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" Apr 16 22:16:25.979751 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.979732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lxr\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-kube-api-access-92lxr\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" Apr 16 22:16:25.979844 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.979803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7-bound-sa-token\") pod \"image-registry-7ffcd7d997-ppg7h\" (UID: \"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7\") " pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" Apr 16 22:16:25.987321 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:25.987297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtfb6\" (UniqueName: \"kubernetes.io/projected/2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0-kube-api-access-mtfb6\") pod \"insights-runtime-extractor-kp5wn\" (UID: \"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0\") " pod="openshift-insights/insights-runtime-extractor-kp5wn" Apr 16 22:16:26.056710 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.056628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kp5wn" Apr 16 22:16:26.138768 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.138739 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:26.186039 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.186008 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kp5wn"]
Apr 16 22:16:26.192567 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:26.191532 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3b2f9b_76cb_4dab_9ed1_fb642c3531d0.slice/crio-382eaba1860449a1df2486bce4d9c99748efd68ef90f8c03c3573486476dd698 WatchSource:0}: Error finding container 382eaba1860449a1df2486bce4d9c99748efd68ef90f8c03c3573486476dd698: Status 404 returned error can't find the container with id 382eaba1860449a1df2486bce4d9c99748efd68ef90f8c03c3573486476dd698
Apr 16 22:16:26.264171 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.264140 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7ffcd7d997-ppg7h"]
Apr 16 22:16:26.266846 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:26.266819 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9edb9f95_1b7c_4a3d_81f4_b34dc212e4c7.slice/crio-4ba60424caccbb416c95e331809386e54ebce2d18808367c63ecfbff53cca9ed WatchSource:0}: Error finding container 4ba60424caccbb416c95e331809386e54ebce2d18808367c63ecfbff53cca9ed: Status 404 returned error can't find the container with id 4ba60424caccbb416c95e331809386e54ebce2d18808367c63ecfbff53cca9ed
Apr 16 22:16:26.943649 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.943616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" event={"ID":"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7","Type":"ContainerStarted","Data":"b20067c45b8ed20ab992ccbfd424455246ba6cb219f2922b610f07c42bfc747d"}
Apr 16 22:16:26.943649 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.943654 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" event={"ID":"9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7","Type":"ContainerStarted","Data":"4ba60424caccbb416c95e331809386e54ebce2d18808367c63ecfbff53cca9ed"}
Apr 16 22:16:26.943906 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.943743 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h"
Apr 16 22:16:26.945097 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.945075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kp5wn" event={"ID":"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0","Type":"ContainerStarted","Data":"17115e6568aea6f906c81042fbc405f8934eda5eb32444d66888b395755b8ad6"}
Apr 16 22:16:26.945193 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.945102 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kp5wn" event={"ID":"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0","Type":"ContainerStarted","Data":"a1967a7383b81e499951266045f80a2b189bf15a28a81ea4a9e3751730ae7ecc"}
Apr 16 22:16:26.945193 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.945116 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kp5wn" event={"ID":"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0","Type":"ContainerStarted","Data":"382eaba1860449a1df2486bce4d9c99748efd68ef90f8c03c3573486476dd698"}
Apr 16 22:16:26.961999 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:26.961961 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" podStartSLOduration=1.9619491930000001 podStartE2EDuration="1.961949193s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:26.961092148 +0000 UTC m=+174.203908411" watchObservedRunningTime="2026-04-16 22:16:26.961949193 +0000 UTC m=+174.204765246"
Apr 16 22:16:27.653173 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:27.653127 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" podUID="961cf913-9711-4243-8973-bd9fccc22537" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused"
Apr 16 22:16:27.949016 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:27.948886 2574 generic.go:358] "Generic (PLEG): container finished" podID="961cf913-9711-4243-8973-bd9fccc22537" containerID="c2ede441087a2788d9925734b95a9a3ff197550d11423ccedb5ea3a67601fa51" exitCode=1
Apr 16 22:16:27.949016 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:27.948959 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" event={"ID":"961cf913-9711-4243-8973-bd9fccc22537","Type":"ContainerDied","Data":"c2ede441087a2788d9925734b95a9a3ff197550d11423ccedb5ea3a67601fa51"}
Apr 16 22:16:27.949430 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:27.949409 2574 scope.go:117] "RemoveContainer" containerID="c2ede441087a2788d9925734b95a9a3ff197550d11423ccedb5ea3a67601fa51"
Apr 16 22:16:28.952900 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:28.952864 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx" event={"ID":"961cf913-9711-4243-8973-bd9fccc22537","Type":"ContainerStarted","Data":"b00f42c213aeccf6e64cffd3b5f5e628c25c4bf72f3f08da717589a4b59bca99"}
Apr 16 22:16:28.953377 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:28.953188 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"
Apr 16 22:16:28.953853 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:28.953831 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5f5ddd7b-g4pxx"
Apr 16 22:16:28.954665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:28.954644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kp5wn" event={"ID":"2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0","Type":"ContainerStarted","Data":"3847da1ce3f476cfc85aa788671738ead3f6825a8265970bffe530263893008b"}
Apr 16 22:16:29.022513 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:29.022462 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kp5wn" podStartSLOduration=2.179093401 podStartE2EDuration="4.022445196s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.259097118 +0000 UTC m=+173.501913156" lastFinishedPulling="2026-04-16 22:16:28.102448898 +0000 UTC m=+175.345264951" observedRunningTime="2026-04-16 22:16:29.022384199 +0000 UTC m=+176.265200261" watchObservedRunningTime="2026-04-16 22:16:29.022445196 +0000 UTC m=+176.265261257"
Apr 16 22:16:33.377715 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.377688 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rlbd6"]
Apr 16 22:16:33.382331 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.382314 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.384645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.384620 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 22:16:33.384817 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.384797 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 22:16:33.385163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.385143 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 22:16:33.385254 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.385174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 22:16:33.385890 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.385865 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mfcw8\""
Apr 16 22:16:33.385890 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.385884 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 22:16:33.386042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.385904 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 22:16:33.427529 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-sys\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427667 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427536 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-tls\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427667 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4cv\" (UniqueName: \"kubernetes.io/projected/14853ac7-bf90-4539-8a3a-0f4dc64657ce-kube-api-access-ds4cv\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427762 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427683 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-root\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427762 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14853ac7-bf90-4539-8a3a-0f4dc64657ce-metrics-client-ca\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427762 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427901 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-wtmp\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.427901 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-textfile\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.428027 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.427910 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528372 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-textfile\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528372 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528574 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528394 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-sys\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528574 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-tls\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528574 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4cv\" (UniqueName: \"kubernetes.io/projected/14853ac7-bf90-4539-8a3a-0f4dc64657ce-kube-api-access-ds4cv\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528574 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-sys\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528574 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-root\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14853ac7-bf90-4539-8a3a-0f4dc64657ce-metrics-client-ca\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528655 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-root\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-wtmp\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.528832 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-textfile\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.529107 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.528833 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-wtmp\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.529162 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.529134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14853ac7-bf90-4539-8a3a-0f4dc64657ce-metrics-client-ca\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.529211 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.529139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.530818 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.530789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-tls\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.530902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.530837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14853ac7-bf90-4539-8a3a-0f4dc64657ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.536459 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.536436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4cv\" (UniqueName: \"kubernetes.io/projected/14853ac7-bf90-4539-8a3a-0f4dc64657ce-kube-api-access-ds4cv\") pod \"node-exporter-rlbd6\" (UID: \"14853ac7-bf90-4539-8a3a-0f4dc64657ce\") " pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.692082 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.692007 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-rlbd6"
Apr 16 22:16:33.699706 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:33.699671 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14853ac7_bf90_4539_8a3a_0f4dc64657ce.slice/crio-725e11513e88a510c6294e8e39c7036340b942a9779ac77d1286053bbe41f621 WatchSource:0}: Error finding container 725e11513e88a510c6294e8e39c7036340b942a9779ac77d1286053bbe41f621: Status 404 returned error can't find the container with id 725e11513e88a510c6294e8e39c7036340b942a9779ac77d1286053bbe41f621
Apr 16 22:16:33.969441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:33.969345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rlbd6" event={"ID":"14853ac7-bf90-4539-8a3a-0f4dc64657ce","Type":"ContainerStarted","Data":"725e11513e88a510c6294e8e39c7036340b942a9779ac77d1286053bbe41f621"}
Apr 16 22:16:34.941419 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:34.941384 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k6prk"
Apr 16 22:16:34.974915 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:34.974868 2574 generic.go:358] "Generic (PLEG): container finished" podID="14853ac7-bf90-4539-8a3a-0f4dc64657ce" containerID="618463c501ba3e40e532feea532de675e05a5fd692f581b717a362f5fceb3605" exitCode=0
Apr 16 22:16:34.975100 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:34.974979 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rlbd6" event={"ID":"14853ac7-bf90-4539-8a3a-0f4dc64657ce","Type":"ContainerDied","Data":"618463c501ba3e40e532feea532de675e05a5fd692f581b717a362f5fceb3605"}
Apr 16 22:16:35.979460 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:35.979420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rlbd6" event={"ID":"14853ac7-bf90-4539-8a3a-0f4dc64657ce","Type":"ContainerStarted","Data":"9e08ae975eab0e07642cead7074b2705f180517a6ecd0d6b799e9fb9349d4618"}
Apr 16 22:16:35.979460 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:35.979461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rlbd6" event={"ID":"14853ac7-bf90-4539-8a3a-0f4dc64657ce","Type":"ContainerStarted","Data":"b99fa2cd7a1c6d82e15dce3ff58fa239d5fbf8719bba33fae93fda3856429911"}
Apr 16 22:16:36.057521 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:36.057464 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rlbd6" podStartSLOduration=2.372682651 podStartE2EDuration="3.057447207s" podCreationTimestamp="2026-04-16 22:16:33 +0000 UTC" firstStartedPulling="2026-04-16 22:16:33.701607717 +0000 UTC m=+180.944423769" lastFinishedPulling="2026-04-16 22:16:34.386372283 +0000 UTC m=+181.629188325" observedRunningTime="2026-04-16 22:16:36.057341574 +0000 UTC m=+183.300157651" watchObservedRunningTime="2026-04-16 22:16:36.057447207 +0000 UTC m=+183.300263269"
Apr 16 22:16:37.771069 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.771039 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-574cb6b97d-cmv9h"]
Apr 16 22:16:37.774085 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.774070 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.776435 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.776414 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 22:16:37.777623 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.777591 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 22:16:37.777623 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.777591 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 22:16:37.777812 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.777596 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-38pg4g8tvub65\""
Apr 16 22:16:37.777812 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.777651 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-6mfjs\""
Apr 16 22:16:37.778251 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.778236 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 22:16:37.785199 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.785176 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-574cb6b97d-cmv9h"]
Apr 16 22:16:37.861681 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c4166b9-1a37-4b4a-bce9-31afc91645a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.861681 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1c4166b9-1a37-4b4a-bce9-31afc91645a4-metrics-server-audit-profiles\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.861902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-client-ca-bundle\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.861902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-secret-metrics-server-client-certs\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.861902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1c4166b9-1a37-4b4a-bce9-31afc91645a4-audit-log\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.861902 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-secret-metrics-server-tls\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.862083 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.861902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rg4l\" (UniqueName: \"kubernetes.io/projected/1c4166b9-1a37-4b4a-bce9-31afc91645a4-kube-api-access-9rg4l\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.962696 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962665 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-secret-metrics-server-client-certs\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.962838 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1c4166b9-1a37-4b4a-bce9-31afc91645a4-audit-log\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h"
Apr 16 22:16:37.962838 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName:
\"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-secret-metrics-server-tls\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.962838 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rg4l\" (UniqueName: \"kubernetes.io/projected/1c4166b9-1a37-4b4a-bce9-31afc91645a4-kube-api-access-9rg4l\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.963007 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c4166b9-1a37-4b4a-bce9-31afc91645a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.963007 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1c4166b9-1a37-4b4a-bce9-31afc91645a4-metrics-server-audit-profiles\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.963007 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.962887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-client-ca-bundle\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " 
pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.963196 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.963165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1c4166b9-1a37-4b4a-bce9-31afc91645a4-audit-log\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.963582 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.963563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c4166b9-1a37-4b4a-bce9-31afc91645a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.964089 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.964066 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1c4166b9-1a37-4b4a-bce9-31afc91645a4-metrics-server-audit-profiles\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.965322 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.965299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-secret-metrics-server-tls\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.965387 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.965370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-client-ca-bundle\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.965422 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.965388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1c4166b9-1a37-4b4a-bce9-31afc91645a4-secret-metrics-server-client-certs\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:37.974591 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:37.974565 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rg4l\" (UniqueName: \"kubernetes.io/projected/1c4166b9-1a37-4b4a-bce9-31afc91645a4-kube-api-access-9rg4l\") pod \"metrics-server-574cb6b97d-cmv9h\" (UID: \"1c4166b9-1a37-4b4a-bce9-31afc91645a4\") " pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:38.082689 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:38.082611 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:38.207408 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:38.207374 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-574cb6b97d-cmv9h"] Apr 16 22:16:38.209907 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:38.209880 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4166b9_1a37_4b4a_bce9_31afc91645a4.slice/crio-e4702fabc256c68e61ed0a1b5aa690caab06d9abb07aeddcfb6ad7a0181eef50 WatchSource:0}: Error finding container e4702fabc256c68e61ed0a1b5aa690caab06d9abb07aeddcfb6ad7a0181eef50: Status 404 returned error can't find the container with id e4702fabc256c68e61ed0a1b5aa690caab06d9abb07aeddcfb6ad7a0181eef50 Apr 16 22:16:38.989030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:38.988990 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" event={"ID":"1c4166b9-1a37-4b4a-bce9-31afc91645a4","Type":"ContainerStarted","Data":"e4702fabc256c68e61ed0a1b5aa690caab06d9abb07aeddcfb6ad7a0181eef50"} Apr 16 22:16:39.992425 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:39.992389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" event={"ID":"1c4166b9-1a37-4b4a-bce9-31afc91645a4","Type":"ContainerStarted","Data":"5e0531af122a57041987d33520a62a00e7578f90c55e7de75b8cda0dbe5886fb"} Apr 16 22:16:40.039305 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:40.039258 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" podStartSLOduration=1.751178104 podStartE2EDuration="3.039241808s" podCreationTimestamp="2026-04-16 22:16:37 +0000 UTC" firstStartedPulling="2026-04-16 22:16:38.211733965 +0000 UTC m=+185.454550004" lastFinishedPulling="2026-04-16 22:16:39.499797665 
+0000 UTC m=+186.742613708" observedRunningTime="2026-04-16 22:16:40.038548629 +0000 UTC m=+187.281364690" watchObservedRunningTime="2026-04-16 22:16:40.039241808 +0000 UTC m=+187.282057871" Apr 16 22:16:41.410503 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.410469 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57df64b7d7-6f86j"] Apr 16 22:16:41.413731 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.413712 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.416502 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.416482 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:16:41.416595 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.416531 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:16:41.417464 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.417447 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:16:41.417556 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.417515 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:16:41.417770 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.417753 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qmtph\"" Apr 16 22:16:41.417847 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.417754 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:16:41.417847 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.417764 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:16:41.417847 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.417818 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:16:41.421785 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.421768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:16:41.425654 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.425624 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57df64b7d7-6f86j"] Apr 16 22:16:41.490094 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-service-ca\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.490240 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk4gt\" (UniqueName: \"kubernetes.io/projected/b8c72e77-b912-4771-a685-91ef2573daa8-kube-api-access-jk4gt\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.490240 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-serving-cert\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.490240 
ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-console-config\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.490396 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490266 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-trusted-ca-bundle\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.490396 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-oauth-config\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.490396 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.490337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-oauth-serving-cert\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591210 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591182 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-service-ca\") pod 
\"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591347 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4gt\" (UniqueName: \"kubernetes.io/projected/b8c72e77-b912-4771-a685-91ef2573daa8-kube-api-access-jk4gt\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591347 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-serving-cert\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591462 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-console-config\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591515 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-trusted-ca-bundle\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591515 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-oauth-config\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.591605 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-oauth-serving-cert\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.592020 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.591993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-console-config\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.592020 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.592011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-service-ca\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.592177 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.592161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-oauth-serving-cert\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.592366 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.592332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-trusted-ca-bundle\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.593809 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.593789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-serving-cert\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.594000 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.593982 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-oauth-config\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.599998 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.599981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4gt\" (UniqueName: \"kubernetes.io/projected/b8c72e77-b912-4771-a685-91ef2573daa8-kube-api-access-jk4gt\") pod \"console-57df64b7d7-6f86j\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.722833 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.722747 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:41.841565 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:41.841535 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57df64b7d7-6f86j"] Apr 16 22:16:41.844285 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:16:41.844256 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c72e77_b912_4771_a685_91ef2573daa8.slice/crio-ef911325031478bbe1c29e8af631851fb6ca0913680695349fa0001c3527fd85 WatchSource:0}: Error finding container ef911325031478bbe1c29e8af631851fb6ca0913680695349fa0001c3527fd85: Status 404 returned error can't find the container with id ef911325031478bbe1c29e8af631851fb6ca0913680695349fa0001c3527fd85 Apr 16 22:16:42.002405 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:42.002324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57df64b7d7-6f86j" event={"ID":"b8c72e77-b912-4771-a685-91ef2573daa8","Type":"ContainerStarted","Data":"ef911325031478bbe1c29e8af631851fb6ca0913680695349fa0001c3527fd85"} Apr 16 22:16:45.012710 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:45.012674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57df64b7d7-6f86j" event={"ID":"b8c72e77-b912-4771-a685-91ef2573daa8","Type":"ContainerStarted","Data":"e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b"} Apr 16 22:16:45.083376 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:45.083329 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57df64b7d7-6f86j" podStartSLOduration=1.6069651280000001 podStartE2EDuration="4.083312361s" podCreationTimestamp="2026-04-16 22:16:41 +0000 UTC" firstStartedPulling="2026-04-16 22:16:41.846039545 +0000 UTC m=+189.088855584" lastFinishedPulling="2026-04-16 22:16:44.322386773 +0000 UTC m=+191.565202817" 
observedRunningTime="2026-04-16 22:16:45.081860061 +0000 UTC m=+192.324676123" watchObservedRunningTime="2026-04-16 22:16:45.083312361 +0000 UTC m=+192.326128422" Apr 16 22:16:47.953639 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:47.953609 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7ffcd7d997-ppg7h" Apr 16 22:16:51.722882 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:51.722846 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:51.722882 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:51.722889 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:51.727626 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:51.727603 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:52.035273 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:52.035199 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:16:58.083357 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:58.083321 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:16:58.083357 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:16:58.083364 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:17:10.391203 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:10.391162 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" podUID="cdea09b8-38c2-462e-949e-6de287fee8bf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Apr 16 22:17:18.088253 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:18.088220 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:17:18.092382 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:18.092358 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-574cb6b97d-cmv9h" Apr 16 22:17:20.391900 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:20.391861 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" podUID="cdea09b8-38c2-462e-949e-6de287fee8bf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 22:17:30.391853 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:30.391813 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" podUID="cdea09b8-38c2-462e-949e-6de287fee8bf" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 22:17:30.392248 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:30.391900 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" Apr 16 22:17:30.392527 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:30.392494 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"f1df5c95ee87c1f58b2f89a6efc8b0949049f809f30a5adcfdc3a2ff97b3d4c5"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 22:17:30.392568 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:30.392550 2574 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" podUID="cdea09b8-38c2-462e-949e-6de287fee8bf" containerName="service-proxy" containerID="cri-o://f1df5c95ee87c1f58b2f89a6efc8b0949049f809f30a5adcfdc3a2ff97b3d4c5" gracePeriod=30 Apr 16 22:17:31.126496 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:31.126458 2574 generic.go:358] "Generic (PLEG): container finished" podID="cdea09b8-38c2-462e-949e-6de287fee8bf" containerID="f1df5c95ee87c1f58b2f89a6efc8b0949049f809f30a5adcfdc3a2ff97b3d4c5" exitCode=2 Apr 16 22:17:31.126686 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:31.126528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" event={"ID":"cdea09b8-38c2-462e-949e-6de287fee8bf","Type":"ContainerDied","Data":"f1df5c95ee87c1f58b2f89a6efc8b0949049f809f30a5adcfdc3a2ff97b3d4c5"} Apr 16 22:17:31.126686 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:31.126563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f888494b-rxfbz" event={"ID":"cdea09b8-38c2-462e-949e-6de287fee8bf","Type":"ContainerStarted","Data":"82b26de32ac36a1a523361c314d8357fd148c8ad8e902869784b7f95bf9833ff"} Apr 16 22:17:45.190800 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:45.190702 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod \"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:17:45.193037 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:45.193008 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc045530-7e0f-412e-98ba-915fe7aa6d22-metrics-certs\") pod 
\"network-metrics-daemon-knqmk\" (UID: \"cc045530-7e0f-412e-98ba-915fe7aa6d22\") " pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:17:45.347705 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:45.347668 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zhbqt\"" Apr 16 22:17:45.355002 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:45.354975 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knqmk" Apr 16 22:17:45.499191 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:45.499108 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-knqmk"] Apr 16 22:17:45.502553 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:17:45.502524 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc045530_7e0f_412e_98ba_915fe7aa6d22.slice/crio-ba70e05a1a86863ed099b18f6dd589fdec5b85769f2c67cf25c25a0ef6ea6b52 WatchSource:0}: Error finding container ba70e05a1a86863ed099b18f6dd589fdec5b85769f2c67cf25c25a0ef6ea6b52: Status 404 returned error can't find the container with id ba70e05a1a86863ed099b18f6dd589fdec5b85769f2c67cf25c25a0ef6ea6b52 Apr 16 22:17:46.173546 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:46.173506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-knqmk" event={"ID":"cc045530-7e0f-412e-98ba-915fe7aa6d22","Type":"ContainerStarted","Data":"ba70e05a1a86863ed099b18f6dd589fdec5b85769f2c67cf25c25a0ef6ea6b52"} Apr 16 22:17:47.177772 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:47.177739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-knqmk" event={"ID":"cc045530-7e0f-412e-98ba-915fe7aa6d22","Type":"ContainerStarted","Data":"16e5ca75fff26b292426155ca1fb4893dd71e6531e24aa493f59d833d0a29179"} Apr 
16 22:17:47.177772 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:47.177772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-knqmk" event={"ID":"cc045530-7e0f-412e-98ba-915fe7aa6d22","Type":"ContainerStarted","Data":"26ca2fe10ae3611c4aa9b5491104caa6178c947a88c8a9c505fab3c97dd7a3a4"} Apr 16 22:17:47.197344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:17:47.197290 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-knqmk" podStartSLOduration=253.1896904 podStartE2EDuration="4m14.197275968s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:17:45.504500111 +0000 UTC m=+252.747316150" lastFinishedPulling="2026-04-16 22:17:46.512085676 +0000 UTC m=+253.754901718" observedRunningTime="2026-04-16 22:17:47.196241338 +0000 UTC m=+254.439057401" watchObservedRunningTime="2026-04-16 22:17:47.197275968 +0000 UTC m=+254.440092028" Apr 16 22:18:01.397441 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.397411 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b9768bb58-nxjqk"] Apr 16 22:18:01.401987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.401970 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.412914 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.412890 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b9768bb58-nxjqk"] Apr 16 22:18:01.532893 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.532846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-trusted-ca-bundle\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.532893 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.532890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-oauth-serving-cert\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.533140 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.532916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-oauth-config\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.533140 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.532970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-console-config\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 
22:18:01.533140 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.532997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6vhw\" (UniqueName: \"kubernetes.io/projected/31192a8b-2476-4185-a8e4-90d4542c93b3-kube-api-access-k6vhw\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.533140 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.533022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-service-ca\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.533140 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.533058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-serving-cert\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.633811 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.633779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-trusted-ca-bundle\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.633811 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.633811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-oauth-serving-cert\") pod 
\"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.634009 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.633833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-oauth-config\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.634009 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.633870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-console-config\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.634009 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.633902 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6vhw\" (UniqueName: \"kubernetes.io/projected/31192a8b-2476-4185-a8e4-90d4542c93b3-kube-api-access-k6vhw\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.634009 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.633961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-service-ca\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.634185 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.634025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-serving-cert\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.635112 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.635039 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-oauth-serving-cert\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.635112 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.635069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-console-config\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.635112 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.635078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-service-ca\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.636880 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.635474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-trusted-ca-bundle\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.637253 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.637230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-oauth-config\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.640849 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.640830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-serving-cert\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.643034 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.643011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6vhw\" (UniqueName: \"kubernetes.io/projected/31192a8b-2476-4185-a8e4-90d4542c93b3-kube-api-access-k6vhw\") pod \"console-b9768bb58-nxjqk\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") " pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.710544 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.710463 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:01.827330 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:01.827300 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b9768bb58-nxjqk"] Apr 16 22:18:01.830245 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:18:01.830212 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31192a8b_2476_4185_a8e4_90d4542c93b3.slice/crio-16da8d4fa6072731ea6e055e15ffd90a966d4de131504dc113300d2784654d9c WatchSource:0}: Error finding container 16da8d4fa6072731ea6e055e15ffd90a966d4de131504dc113300d2784654d9c: Status 404 returned error can't find the container with id 16da8d4fa6072731ea6e055e15ffd90a966d4de131504dc113300d2784654d9c Apr 16 22:18:02.220239 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:02.220202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9768bb58-nxjqk" event={"ID":"31192a8b-2476-4185-a8e4-90d4542c93b3","Type":"ContainerStarted","Data":"dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1"} Apr 16 22:18:02.220239 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:02.220244 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9768bb58-nxjqk" event={"ID":"31192a8b-2476-4185-a8e4-90d4542c93b3","Type":"ContainerStarted","Data":"16da8d4fa6072731ea6e055e15ffd90a966d4de131504dc113300d2784654d9c"} Apr 16 22:18:02.238184 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:02.238134 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b9768bb58-nxjqk" podStartSLOduration=1.238118434 podStartE2EDuration="1.238118434s" podCreationTimestamp="2026-04-16 22:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:02.237061963 +0000 UTC m=+269.479878049" 
watchObservedRunningTime="2026-04-16 22:18:02.238118434 +0000 UTC m=+269.480934494" Apr 16 22:18:11.711362 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:11.711322 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:11.711798 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:11.711462 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:11.716182 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:11.716162 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:12.254665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:12.254635 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b9768bb58-nxjqk" Apr 16 22:18:12.293668 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:12.293632 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57df64b7d7-6f86j"] Apr 16 22:18:33.319182 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:33.319144 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:18:33.321154 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:33.321130 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:18:33.324757 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:33.324733 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:18:37.312622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.312558 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57df64b7d7-6f86j" podUID="b8c72e77-b912-4771-a685-91ef2573daa8" 
containerName="console" containerID="cri-o://e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b" gracePeriod=15 Apr 16 22:18:37.548235 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.548205 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57df64b7d7-6f86j_b8c72e77-b912-4771-a685-91ef2573daa8/console/0.log" Apr 16 22:18:37.548337 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.548276 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:18:37.591659 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.591586 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-console-config\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.591659 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.591630 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk4gt\" (UniqueName: \"kubernetes.io/projected/b8c72e77-b912-4771-a685-91ef2573daa8-kube-api-access-jk4gt\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.591848 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.591661 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-oauth-config\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.591848 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.591782 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-oauth-serving-cert\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.591848 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.591833 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-service-ca\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.592003 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.591979 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-serving-cert\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.592073 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592039 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-trusted-ca-bundle\") pod \"b8c72e77-b912-4771-a685-91ef2573daa8\" (UID: \"b8c72e77-b912-4771-a685-91ef2573daa8\") " Apr 16 22:18:37.592129 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592097 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-console-config" (OuterVolumeSpecName: "console-config") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:37.592184 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592131 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-service-ca" (OuterVolumeSpecName: "service-ca") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:37.592184 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592155 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:37.592384 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592364 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-oauth-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:37.592482 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592387 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-service-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:37.592482 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592400 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-console-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:37.592482 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.592403 
2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:37.594137 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.594108 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:37.594283 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.594263 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c72e77-b912-4771-a685-91ef2573daa8-kube-api-access-jk4gt" (OuterVolumeSpecName: "kube-api-access-jk4gt") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "kube-api-access-jk4gt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:18:37.594340 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.594315 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b8c72e77-b912-4771-a685-91ef2573daa8" (UID: "b8c72e77-b912-4771-a685-91ef2573daa8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:37.693559 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.693520 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:37.693559 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.693551 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8c72e77-b912-4771-a685-91ef2573daa8-trusted-ca-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:37.693559 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.693561 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jk4gt\" (UniqueName: \"kubernetes.io/projected/b8c72e77-b912-4771-a685-91ef2573daa8-kube-api-access-jk4gt\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:37.693559 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:37.693571 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8c72e77-b912-4771-a685-91ef2573daa8-console-oauth-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:18:38.323214 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.323187 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57df64b7d7-6f86j_b8c72e77-b912-4771-a685-91ef2573daa8/console/0.log" Apr 16 22:18:38.323645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.323225 2574 generic.go:358] "Generic (PLEG): container finished" podID="b8c72e77-b912-4771-a685-91ef2573daa8" containerID="e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b" exitCode=2 Apr 16 22:18:38.323645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.323274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-57df64b7d7-6f86j" event={"ID":"b8c72e77-b912-4771-a685-91ef2573daa8","Type":"ContainerDied","Data":"e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b"} Apr 16 22:18:38.323645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.323296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57df64b7d7-6f86j" event={"ID":"b8c72e77-b912-4771-a685-91ef2573daa8","Type":"ContainerDied","Data":"ef911325031478bbe1c29e8af631851fb6ca0913680695349fa0001c3527fd85"} Apr 16 22:18:38.323645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.323311 2574 scope.go:117] "RemoveContainer" containerID="e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b" Apr 16 22:18:38.323645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.323312 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57df64b7d7-6f86j" Apr 16 22:18:38.331217 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.331200 2574 scope.go:117] "RemoveContainer" containerID="e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b" Apr 16 22:18:38.331490 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:18:38.331464 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b\": container with ID starting with e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b not found: ID does not exist" containerID="e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b" Apr 16 22:18:38.331546 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.331493 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b"} err="failed to get container status \"e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b\": rpc error: code = 
NotFound desc = could not find container \"e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b\": container with ID starting with e6c4a7c999ed006ce1cc03f42911e8efbed2d2b48c9ef6b2a347765d856e265b not found: ID does not exist" Apr 16 22:18:38.342850 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.342829 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57df64b7d7-6f86j"] Apr 16 22:18:38.348140 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:38.348118 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57df64b7d7-6f86j"] Apr 16 22:18:39.447655 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:18:39.447618 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c72e77-b912-4771-a685-91ef2573daa8" path="/var/lib/kubelet/pods/b8c72e77-b912-4771-a685-91ef2573daa8/volumes" Apr 16 22:19:10.166448 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.166418 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f75776864-449jw"] Apr 16 22:19:10.166879 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.166655 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8c72e77-b912-4771-a685-91ef2573daa8" containerName="console" Apr 16 22:19:10.166879 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.166666 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c72e77-b912-4771-a685-91ef2573daa8" containerName="console" Apr 16 22:19:10.166879 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.166708 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8c72e77-b912-4771-a685-91ef2573daa8" containerName="console" Apr 16 22:19:10.169502 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.169486 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f75776864-449jw" Apr 16 22:19:10.189123 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.189087 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f75776864-449jw"] Apr 16 22:19:10.225537 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225501 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-trusted-ca-bundle\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw" Apr 16 22:19:10.225723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-serving-cert\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw" Apr 16 22:19:10.225723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-oauth-serving-cert\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw" Apr 16 22:19:10.225723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225608 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-config\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw" Apr 16 
22:19:10.225723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-oauth-config\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.225723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-service-ca\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.225723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.225721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpfqb\" (UniqueName: \"kubernetes.io/projected/7a3f46cc-780b-4bc5-8dc9-793c584581df-kube-api-access-vpfqb\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.326821 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.326786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-config\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.326821 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.326826 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-oauth-config\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.327099 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.326853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-service-ca\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.327099 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.326979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpfqb\" (UniqueName: \"kubernetes.io/projected/7a3f46cc-780b-4bc5-8dc9-793c584581df-kube-api-access-vpfqb\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.327099 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.327057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-trusted-ca-bundle\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.327099 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.327098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-serving-cert\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.327272 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.327120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-oauth-serving-cert\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.328045 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.328012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-config\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.328206 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.328037 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-service-ca\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.330960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.328413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-oauth-serving-cert\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.330960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.329346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-trusted-ca-bundle\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.330960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.330254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-oauth-config\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.330960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.330364 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-serving-cert\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.337349 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.337321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpfqb\" (UniqueName: \"kubernetes.io/projected/7a3f46cc-780b-4bc5-8dc9-793c584581df-kube-api-access-vpfqb\") pod \"console-6f75776864-449jw\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") " pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.478049 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.477970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:10.594853 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.594822 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f75776864-449jw"]
Apr 16 22:19:10.597766 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:19:10.597738 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3f46cc_780b_4bc5_8dc9_793c584581df.slice/crio-dfb92671ab9332cdb4366c5c7bc53dedf373c808869294342275ea568b190e4d WatchSource:0}: Error finding container dfb92671ab9332cdb4366c5c7bc53dedf373c808869294342275ea568b190e4d: Status 404 returned error can't find the container with id dfb92671ab9332cdb4366c5c7bc53dedf373c808869294342275ea568b190e4d
Apr 16 22:19:10.599482 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:10.599467 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:19:11.412501 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:11.412416 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f75776864-449jw" event={"ID":"7a3f46cc-780b-4bc5-8dc9-793c584581df","Type":"ContainerStarted","Data":"525c94e4a55f39018309ba782cb559341682705474e29d3854e7b1cd1b6d3f11"}
Apr 16 22:19:11.412501 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:11.412454 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f75776864-449jw" event={"ID":"7a3f46cc-780b-4bc5-8dc9-793c584581df","Type":"ContainerStarted","Data":"dfb92671ab9332cdb4366c5c7bc53dedf373c808869294342275ea568b190e4d"}
Apr 16 22:19:11.430569 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:11.430528 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f75776864-449jw" podStartSLOduration=1.430514466 podStartE2EDuration="1.430514466s" podCreationTimestamp="2026-04-16 22:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:19:11.428769614 +0000 UTC m=+338.671585675" watchObservedRunningTime="2026-04-16 22:19:11.430514466 +0000 UTC m=+338.673330528"
Apr 16 22:19:20.478310 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:20.478276 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:20.478310 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:20.478313 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:20.483197 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:20.483175 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:21.447830 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:21.447804 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:19:21.498409 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:21.498379 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b9768bb58-nxjqk"]
Apr 16 22:19:46.519390 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.519327 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b9768bb58-nxjqk" podUID="31192a8b-2476-4185-a8e4-90d4542c93b3" containerName="console" containerID="cri-o://dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1" gracePeriod=15
Apr 16 22:19:46.749851 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.749828 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b9768bb58-nxjqk_31192a8b-2476-4185-a8e4-90d4542c93b3/console/0.log"
Apr 16 22:19:46.749982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.749889 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b9768bb58-nxjqk"
Apr 16 22:19:46.806487 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806409 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-oauth-config\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806487 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806470 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-oauth-serving-cert\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806678 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806496 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6vhw\" (UniqueName: \"kubernetes.io/projected/31192a8b-2476-4185-a8e4-90d4542c93b3-kube-api-access-k6vhw\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806678 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806522 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-trusted-ca-bundle\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806678 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806551 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-service-ca\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806678 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806591 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-console-config\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806678 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806614 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-serving-cert\") pod \"31192a8b-2476-4185-a8e4-90d4542c93b3\" (UID: \"31192a8b-2476-4185-a8e4-90d4542c93b3\") "
Apr 16 22:19:46.806987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806957 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:46.807074 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.806983 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:46.807074 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.807026 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-console-config" (OuterVolumeSpecName: "console-config") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:46.807074 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.807029 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-service-ca" (OuterVolumeSpecName: "service-ca") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:46.808700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.808680 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:19:46.809029 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.809006 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:19:46.809108 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.809031 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31192a8b-2476-4185-a8e4-90d4542c93b3-kube-api-access-k6vhw" (OuterVolumeSpecName: "kube-api-access-k6vhw") pod "31192a8b-2476-4185-a8e4-90d4542c93b3" (UID: "31192a8b-2476-4185-a8e4-90d4542c93b3"). InnerVolumeSpecName "kube-api-access-k6vhw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:19:46.907424 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907384 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-console-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:46.907424 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907419 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:46.907424 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907430 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31192a8b-2476-4185-a8e4-90d4542c93b3-console-oauth-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:46.907424 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907439 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-oauth-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:46.907663 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907448 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k6vhw\" (UniqueName: \"kubernetes.io/projected/31192a8b-2476-4185-a8e4-90d4542c93b3-kube-api-access-k6vhw\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:46.907663 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907457 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-trusted-ca-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:46.907663 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:46.907466 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31192a8b-2476-4185-a8e4-90d4542c93b3-service-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:19:47.505417 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.505390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b9768bb58-nxjqk_31192a8b-2476-4185-a8e4-90d4542c93b3/console/0.log"
Apr 16 22:19:47.505555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.505428 2574 generic.go:358] "Generic (PLEG): container finished" podID="31192a8b-2476-4185-a8e4-90d4542c93b3" containerID="dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1" exitCode=2
Apr 16 22:19:47.505555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.505461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9768bb58-nxjqk" event={"ID":"31192a8b-2476-4185-a8e4-90d4542c93b3","Type":"ContainerDied","Data":"dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1"}
Apr 16 22:19:47.505555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.505481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9768bb58-nxjqk" event={"ID":"31192a8b-2476-4185-a8e4-90d4542c93b3","Type":"ContainerDied","Data":"16da8d4fa6072731ea6e055e15ffd90a966d4de131504dc113300d2784654d9c"}
Apr 16 22:19:47.505555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.505495 2574 scope.go:117] "RemoveContainer" containerID="dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1"
Apr 16 22:19:47.505555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.505491 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b9768bb58-nxjqk"
Apr 16 22:19:47.513078 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.513043 2574 scope.go:117] "RemoveContainer" containerID="dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1"
Apr 16 22:19:47.513319 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:19:47.513299 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1\": container with ID starting with dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1 not found: ID does not exist" containerID="dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1"
Apr 16 22:19:47.513393 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.513325 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1"} err="failed to get container status \"dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1\": rpc error: code = NotFound desc = could not find container \"dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1\": container with ID starting with dae842f9ec21810ba754c40527ae8d8136d34317118233ca6fea344078a6b5c1 not found: ID does not exist"
Apr 16 22:19:47.522000 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.521978 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b9768bb58-nxjqk"]
Apr 16 22:19:47.525947 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:47.525915 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b9768bb58-nxjqk"]
Apr 16 22:19:49.447673 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:19:49.447636 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31192a8b-2476-4185-a8e4-90d4542c93b3" path="/var/lib/kubelet/pods/31192a8b-2476-4185-a8e4-90d4542c93b3/volumes"
Apr 16 22:20:18.098302 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.098272 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"]
Apr 16 22:20:18.098764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.098526 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31192a8b-2476-4185-a8e4-90d4542c93b3" containerName="console"
Apr 16 22:20:18.098764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.098540 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="31192a8b-2476-4185-a8e4-90d4542c93b3" containerName="console"
Apr 16 22:20:18.098764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.098581 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="31192a8b-2476-4185-a8e4-90d4542c93b3" containerName="console"
Apr 16 22:20:18.101293 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.101274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.103782 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.103761 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 22:20:18.103893 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.103815 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mrwkv\""
Apr 16 22:20:18.104392 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.104367 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 22:20:18.109611 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.109592 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"]
Apr 16 22:20:18.227632 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.227602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpbjl\" (UniqueName: \"kubernetes.io/projected/016242b2-f839-41be-8ef1-0e5b2d45c445-kube-api-access-cpbjl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.227780 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.227660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.227780 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.227681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.328846 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.328800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.328846 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.328851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.329021 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.328893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpbjl\" (UniqueName: \"kubernetes.io/projected/016242b2-f839-41be-8ef1-0e5b2d45c445-kube-api-access-cpbjl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.329187 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.329169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.329252 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.329233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.340558 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.340535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpbjl\" (UniqueName: \"kubernetes.io/projected/016242b2-f839-41be-8ef1-0e5b2d45c445-kube-api-access-cpbjl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.410181 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.410156 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:18.530003 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.529971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"]
Apr 16 22:20:18.533024 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:20:18.532997 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod016242b2_f839_41be_8ef1_0e5b2d45c445.slice/crio-c50755ee1519e00e289c405dcf9311c2f8762ed298123343fd37c2147d03aad7 WatchSource:0}: Error finding container c50755ee1519e00e289c405dcf9311c2f8762ed298123343fd37c2147d03aad7: Status 404 returned error can't find the container with id c50755ee1519e00e289c405dcf9311c2f8762ed298123343fd37c2147d03aad7
Apr 16 22:20:18.584241 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:18.584192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq" event={"ID":"016242b2-f839-41be-8ef1-0e5b2d45c445","Type":"ContainerStarted","Data":"c50755ee1519e00e289c405dcf9311c2f8762ed298123343fd37c2147d03aad7"}
Apr 16 22:20:24.602112 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:24.602039 2574 generic.go:358] "Generic (PLEG): container finished" podID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerID="21b434eef728ac1c31bc661e73c5d0c4028245b7550c7740f361dd9f997d28e6" exitCode=0
Apr 16 22:20:24.602421 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:24.602120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq" event={"ID":"016242b2-f839-41be-8ef1-0e5b2d45c445","Type":"ContainerDied","Data":"21b434eef728ac1c31bc661e73c5d0c4028245b7550c7740f361dd9f997d28e6"}
Apr 16 22:20:27.612221 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:27.612187 2574 generic.go:358] "Generic (PLEG): container finished" podID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerID="df554f29a388d935c778572ab1503e3187605a8a68d6c06b8ff9f2afb9a7274b" exitCode=0
Apr 16 22:20:27.612600 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:27.612275 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq" event={"ID":"016242b2-f839-41be-8ef1-0e5b2d45c445","Type":"ContainerDied","Data":"df554f29a388d935c778572ab1503e3187605a8a68d6c06b8ff9f2afb9a7274b"}
Apr 16 22:20:34.634068 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:34.634028 2574 generic.go:358] "Generic (PLEG): container finished" podID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerID="8e93ef316be192352ba3109f99ac1652c3b3398af3ce4586490abeb3a14c6d48" exitCode=0
Apr 16 22:20:34.634440 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:34.634115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq" event={"ID":"016242b2-f839-41be-8ef1-0e5b2d45c445","Type":"ContainerDied","Data":"8e93ef316be192352ba3109f99ac1652c3b3398af3ce4586490abeb3a14c6d48"}
Apr 16 22:20:35.754480 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.754461 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:20:35.874314 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.874278 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpbjl\" (UniqueName: \"kubernetes.io/projected/016242b2-f839-41be-8ef1-0e5b2d45c445-kube-api-access-cpbjl\") pod \"016242b2-f839-41be-8ef1-0e5b2d45c445\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") "
Apr 16 22:20:35.874480 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.874373 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-util\") pod \"016242b2-f839-41be-8ef1-0e5b2d45c445\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") "
Apr 16 22:20:35.874480 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.874436 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-bundle\") pod \"016242b2-f839-41be-8ef1-0e5b2d45c445\" (UID: \"016242b2-f839-41be-8ef1-0e5b2d45c445\") "
Apr 16 22:20:35.874981 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.874955 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-bundle" (OuterVolumeSpecName: "bundle") pod "016242b2-f839-41be-8ef1-0e5b2d45c445" (UID: "016242b2-f839-41be-8ef1-0e5b2d45c445"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:20:35.876390 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.876371 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016242b2-f839-41be-8ef1-0e5b2d45c445-kube-api-access-cpbjl" (OuterVolumeSpecName: "kube-api-access-cpbjl") pod "016242b2-f839-41be-8ef1-0e5b2d45c445" (UID: "016242b2-f839-41be-8ef1-0e5b2d45c445"). InnerVolumeSpecName "kube-api-access-cpbjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:20:35.878867 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.878845 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-util" (OuterVolumeSpecName: "util") pod "016242b2-f839-41be-8ef1-0e5b2d45c445" (UID: "016242b2-f839-41be-8ef1-0e5b2d45c445"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:20:35.975472 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.975397 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-util\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:20:35.975472 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.975423 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016242b2-f839-41be-8ef1-0e5b2d45c445-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:20:35.975472 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:35.975432 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpbjl\" (UniqueName: \"kubernetes.io/projected/016242b2-f839-41be-8ef1-0e5b2d45c445-kube-api-access-cpbjl\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:20:36.641234 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:36.641192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq" event={"ID":"016242b2-f839-41be-8ef1-0e5b2d45c445","Type":"ContainerDied","Data":"c50755ee1519e00e289c405dcf9311c2f8762ed298123343fd37c2147d03aad7"}
Apr 16 22:20:36.641234 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:36.641230 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c50755ee1519e00e289c405dcf9311c2f8762ed298123343fd37c2147d03aad7"
Apr 16 22:20:36.641234 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:20:36.641214 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czrwtq"
Apr 16 22:21:51.166162 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166130 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"]
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166384 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="extract"
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166397 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="extract"
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166415 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="util"
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166420 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="util"
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166426 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="pull"
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166432 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="pull"
Apr 16 22:21:51.166622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.166474 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="016242b2-f839-41be-8ef1-0e5b2d45c445" containerName="extract"
Apr 16 22:21:51.168234 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.168208 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"
Apr 16 22:21:51.171994 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.171968 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-jfh87\""
Apr 16 22:21:51.172148 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.172092 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 22:21:51.172148 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.172108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 22:21:51.172259 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.172184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 22:21:51.173298 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.173278 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-5blj5"]
Apr 16 22:21:51.175180 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.175164 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:51.177254 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.177238 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 22:21:51.177411 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.177394 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-c8rdr\"" Apr 16 22:21:51.180555 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.180533 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"] Apr 16 22:21:51.184882 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.184865 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-5blj5"] Apr 16 22:21:51.303518 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.303486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f27jt\" (UniqueName: \"kubernetes.io/projected/2242979c-1d41-449b-a586-be0a382752c9-kube-api-access-f27jt\") pod \"kserve-controller-manager-84d7d5cfc6-gmtz6\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.303696 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.303558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmvlg\" (UniqueName: \"kubernetes.io/projected/cb1efc1b-240a-4c14-ac61-623284c6e9f7-kube-api-access-pmvlg\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:51.303696 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.303583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2242979c-1d41-449b-a586-be0a382752c9-cert\") pod \"kserve-controller-manager-84d7d5cfc6-gmtz6\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.303696 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.303623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb1efc1b-240a-4c14-ac61-623284c6e9f7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:51.404775 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.404730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb1efc1b-240a-4c14-ac61-623284c6e9f7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:51.404775 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.404783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f27jt\" (UniqueName: \"kubernetes.io/projected/2242979c-1d41-449b-a586-be0a382752c9-kube-api-access-f27jt\") pod \"kserve-controller-manager-84d7d5cfc6-gmtz6\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.405024 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.404835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmvlg\" (UniqueName: \"kubernetes.io/projected/cb1efc1b-240a-4c14-ac61-623284c6e9f7-kube-api-access-pmvlg\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" 
Apr 16 22:21:51.405024 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.404854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2242979c-1d41-449b-a586-be0a382752c9-cert\") pod \"kserve-controller-manager-84d7d5cfc6-gmtz6\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.405024 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:21:51.404898 2574 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 22:21:51.405024 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:21:51.404997 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb1efc1b-240a-4c14-ac61-623284c6e9f7-cert podName:cb1efc1b-240a-4c14-ac61-623284c6e9f7 nodeName:}" failed. No retries permitted until 2026-04-16 22:21:51.904976078 +0000 UTC m=+499.147792116 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cb1efc1b-240a-4c14-ac61-623284c6e9f7-cert") pod "llmisvc-controller-manager-68cc5db7c4-5blj5" (UID: "cb1efc1b-240a-4c14-ac61-623284c6e9f7") : secret "llmisvc-webhook-server-cert" not found Apr 16 22:21:51.409921 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.408344 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2242979c-1d41-449b-a586-be0a382752c9-cert\") pod \"kserve-controller-manager-84d7d5cfc6-gmtz6\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.414025 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.414002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f27jt\" (UniqueName: \"kubernetes.io/projected/2242979c-1d41-449b-a586-be0a382752c9-kube-api-access-f27jt\") pod \"kserve-controller-manager-84d7d5cfc6-gmtz6\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.414129 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.414104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmvlg\" (UniqueName: \"kubernetes.io/projected/cb1efc1b-240a-4c14-ac61-623284c6e9f7-kube-api-access-pmvlg\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:51.478843 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.478760 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:51.595605 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.595570 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"] Apr 16 22:21:51.598659 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:21:51.598630 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2242979c_1d41_449b_a586_be0a382752c9.slice/crio-e11e744f3bea48cd9589ffc6caef187eaf53cd79e6b98cef1a0a218e8fbbf318 WatchSource:0}: Error finding container e11e744f3bea48cd9589ffc6caef187eaf53cd79e6b98cef1a0a218e8fbbf318: Status 404 returned error can't find the container with id e11e744f3bea48cd9589ffc6caef187eaf53cd79e6b98cef1a0a218e8fbbf318 Apr 16 22:21:51.835866 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.835778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" event={"ID":"2242979c-1d41-449b-a586-be0a382752c9","Type":"ContainerStarted","Data":"e11e744f3bea48cd9589ffc6caef187eaf53cd79e6b98cef1a0a218e8fbbf318"} Apr 16 22:21:51.909346 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.909309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb1efc1b-240a-4c14-ac61-623284c6e9f7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:51.911615 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:51.911596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb1efc1b-240a-4c14-ac61-623284c6e9f7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-5blj5\" (UID: \"cb1efc1b-240a-4c14-ac61-623284c6e9f7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 
16 22:21:52.086672 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:52.086596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:52.229011 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:52.228956 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-5blj5"] Apr 16 22:21:52.232785 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:21:52.232751 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb1efc1b_240a_4c14_ac61_623284c6e9f7.slice/crio-906cc7fd1127fd93948ec4d09c176e0ac84e8791b05aefceeb603fac141ff3ab WatchSource:0}: Error finding container 906cc7fd1127fd93948ec4d09c176e0ac84e8791b05aefceeb603fac141ff3ab: Status 404 returned error can't find the container with id 906cc7fd1127fd93948ec4d09c176e0ac84e8791b05aefceeb603fac141ff3ab Apr 16 22:21:52.839238 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:52.839197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" event={"ID":"cb1efc1b-240a-4c14-ac61-623284c6e9f7","Type":"ContainerStarted","Data":"906cc7fd1127fd93948ec4d09c176e0ac84e8791b05aefceeb603fac141ff3ab"} Apr 16 22:21:55.849657 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:55.849623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" event={"ID":"cb1efc1b-240a-4c14-ac61-623284c6e9f7","Type":"ContainerStarted","Data":"dbcaf59825c5b24a18d3c8d640d5de0877ec69c5a0a95ba5e4540c2fbe002851"} Apr 16 22:21:55.850122 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:55.849720 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:21:55.850953 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:55.850906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" event={"ID":"2242979c-1d41-449b-a586-be0a382752c9","Type":"ContainerStarted","Data":"3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4"} Apr 16 22:21:55.851086 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:55.851071 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:21:55.868117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:55.868070 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" podStartSLOduration=2.0390362890000002 podStartE2EDuration="4.868057774s" podCreationTimestamp="2026-04-16 22:21:51 +0000 UTC" firstStartedPulling="2026-04-16 22:21:52.234759593 +0000 UTC m=+499.477575635" lastFinishedPulling="2026-04-16 22:21:55.063781078 +0000 UTC m=+502.306597120" observedRunningTime="2026-04-16 22:21:55.865526973 +0000 UTC m=+503.108343056" watchObservedRunningTime="2026-04-16 22:21:55.868057774 +0000 UTC m=+503.110873835" Apr 16 22:21:55.882440 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:21:55.882404 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" podStartSLOduration=2.119271566 podStartE2EDuration="4.882389998s" podCreationTimestamp="2026-04-16 22:21:51 +0000 UTC" firstStartedPulling="2026-04-16 22:21:51.60023205 +0000 UTC m=+498.843048088" lastFinishedPulling="2026-04-16 22:21:54.363350481 +0000 UTC m=+501.606166520" observedRunningTime="2026-04-16 22:21:55.881740881 +0000 UTC m=+503.124556942" watchObservedRunningTime="2026-04-16 22:21:55.882389998 +0000 UTC m=+503.125206059" Apr 16 22:22:26.856216 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:26.856138 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-5blj5" Apr 16 22:22:26.859079 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:22:26.859058 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:22:28.028631 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.028600 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"] Apr 16 22:22:28.029055 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.028808 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" podUID="2242979c-1d41-449b-a586-be0a382752c9" containerName="manager" containerID="cri-o://3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4" gracePeriod=10 Apr 16 22:22:28.056504 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.056482 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-dr69d"] Apr 16 22:22:28.061112 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.061094 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.069200 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.069176 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-dr69d"] Apr 16 22:22:28.179113 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.179084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc8535b-b4c7-4da6-86a0-d78d1243c64d-cert\") pod \"kserve-controller-manager-84d7d5cfc6-dr69d\" (UID: \"3dc8535b-b4c7-4da6-86a0-d78d1243c64d\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.179266 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.179140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsmz\" (UniqueName: \"kubernetes.io/projected/3dc8535b-b4c7-4da6-86a0-d78d1243c64d-kube-api-access-gtsmz\") pod \"kserve-controller-manager-84d7d5cfc6-dr69d\" (UID: \"3dc8535b-b4c7-4da6-86a0-d78d1243c64d\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.254224 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.254200 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:22:28.280319 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.280261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc8535b-b4c7-4da6-86a0-d78d1243c64d-cert\") pod \"kserve-controller-manager-84d7d5cfc6-dr69d\" (UID: \"3dc8535b-b4c7-4da6-86a0-d78d1243c64d\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.280428 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.280330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsmz\" (UniqueName: \"kubernetes.io/projected/3dc8535b-b4c7-4da6-86a0-d78d1243c64d-kube-api-access-gtsmz\") pod \"kserve-controller-manager-84d7d5cfc6-dr69d\" (UID: \"3dc8535b-b4c7-4da6-86a0-d78d1243c64d\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.283107 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.283083 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dc8535b-b4c7-4da6-86a0-d78d1243c64d-cert\") pod \"kserve-controller-manager-84d7d5cfc6-dr69d\" (UID: \"3dc8535b-b4c7-4da6-86a0-d78d1243c64d\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.294425 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.294399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsmz\" (UniqueName: \"kubernetes.io/projected/3dc8535b-b4c7-4da6-86a0-d78d1243c64d-kube-api-access-gtsmz\") pod \"kserve-controller-manager-84d7d5cfc6-dr69d\" (UID: \"3dc8535b-b4c7-4da6-86a0-d78d1243c64d\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.381242 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.381214 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2242979c-1d41-449b-a586-be0a382752c9-cert\") pod \"2242979c-1d41-449b-a586-be0a382752c9\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " Apr 16 22:22:28.381468 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.381266 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f27jt\" (UniqueName: \"kubernetes.io/projected/2242979c-1d41-449b-a586-be0a382752c9-kube-api-access-f27jt\") pod \"2242979c-1d41-449b-a586-be0a382752c9\" (UID: \"2242979c-1d41-449b-a586-be0a382752c9\") " Apr 16 22:22:28.383374 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.383342 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2242979c-1d41-449b-a586-be0a382752c9-cert" (OuterVolumeSpecName: "cert") pod "2242979c-1d41-449b-a586-be0a382752c9" (UID: "2242979c-1d41-449b-a586-be0a382752c9"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:22:28.383374 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.383364 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2242979c-1d41-449b-a586-be0a382752c9-kube-api-access-f27jt" (OuterVolumeSpecName: "kube-api-access-f27jt") pod "2242979c-1d41-449b-a586-be0a382752c9" (UID: "2242979c-1d41-449b-a586-be0a382752c9"). InnerVolumeSpecName "kube-api-access-f27jt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:28.410645 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.410619 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:28.482390 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.482362 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2242979c-1d41-449b-a586-be0a382752c9-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:22:28.482390 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.482389 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f27jt\" (UniqueName: \"kubernetes.io/projected/2242979c-1d41-449b-a586-be0a382752c9-kube-api-access-f27jt\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:22:28.531415 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.531392 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-dr69d"] Apr 16 22:22:28.533753 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:22:28.533723 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc8535b_b4c7_4da6_86a0_d78d1243c64d.slice/crio-38f5ef44d991e5b862be2ed2f57128faa33cf1fd82e2a2e858f65992e92fc25a WatchSource:0}: Error finding container 38f5ef44d991e5b862be2ed2f57128faa33cf1fd82e2a2e858f65992e92fc25a: Status 404 returned error can't find the container with id 38f5ef44d991e5b862be2ed2f57128faa33cf1fd82e2a2e858f65992e92fc25a Apr 16 22:22:28.946800 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.946771 2574 generic.go:358] "Generic (PLEG): container finished" podID="2242979c-1d41-449b-a586-be0a382752c9" containerID="3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4" exitCode=0 Apr 16 22:22:28.947006 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.946832 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" Apr 16 22:22:28.947006 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.946862 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" event={"ID":"2242979c-1d41-449b-a586-be0a382752c9","Type":"ContainerDied","Data":"3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4"} Apr 16 22:22:28.947006 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.946897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-gmtz6" event={"ID":"2242979c-1d41-449b-a586-be0a382752c9","Type":"ContainerDied","Data":"e11e744f3bea48cd9589ffc6caef187eaf53cd79e6b98cef1a0a218e8fbbf318"} Apr 16 22:22:28.947006 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.946914 2574 scope.go:117] "RemoveContainer" containerID="3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4" Apr 16 22:22:28.947982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.947959 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" event={"ID":"3dc8535b-b4c7-4da6-86a0-d78d1243c64d","Type":"ContainerStarted","Data":"38f5ef44d991e5b862be2ed2f57128faa33cf1fd82e2a2e858f65992e92fc25a"} Apr 16 22:22:28.972888 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.972845 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"] Apr 16 22:22:28.979754 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.979733 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-gmtz6"] Apr 16 22:22:28.990484 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.990465 2574 scope.go:117] "RemoveContainer" containerID="3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4" Apr 16 22:22:28.990767 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:22:28.990748 2574 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4\": container with ID starting with 3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4 not found: ID does not exist" containerID="3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4" Apr 16 22:22:28.990816 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:28.990776 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4"} err="failed to get container status \"3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4\": rpc error: code = NotFound desc = could not find container \"3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4\": container with ID starting with 3c96a2f3d05cb93be34e4d78154c6765a2eb360bb3f6d5eca5174a416dfd01f4 not found: ID does not exist" Apr 16 22:22:29.448488 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:29.448449 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2242979c-1d41-449b-a586-be0a382752c9" path="/var/lib/kubelet/pods/2242979c-1d41-449b-a586-be0a382752c9/volumes" Apr 16 22:22:29.953019 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:29.952983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" event={"ID":"3dc8535b-b4c7-4da6-86a0-d78d1243c64d","Type":"ContainerStarted","Data":"a55398270ad2e0e481713e534200c8e03ca91e4c958b12d61d0fc57fde21763a"} Apr 16 22:22:29.953168 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:29.953107 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" Apr 16 22:22:29.969993 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:22:29.969917 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d" podStartSLOduration=1.421232692 podStartE2EDuration="1.969901389s" podCreationTimestamp="2026-04-16 22:22:28 +0000 UTC" firstStartedPulling="2026-04-16 22:22:28.535014757 +0000 UTC m=+535.777830796" lastFinishedPulling="2026-04-16 22:22:29.08368345 +0000 UTC m=+536.326499493" observedRunningTime="2026-04-16 22:22:29.968883337 +0000 UTC m=+537.211699398" watchObservedRunningTime="2026-04-16 22:22:29.969901389 +0000 UTC m=+537.212717451"
Apr 16 22:23:00.960390 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:00.960358 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-dr69d"
Apr 16 22:23:18.855298 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:18.855261 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f75776864-449jw"]
Apr 16 22:23:33.339843 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:33.339816 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:23:33.341435 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:33.341414 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:23:38.108634 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.108601 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"]
Apr 16 22:23:38.110827 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.108851 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2242979c-1d41-449b-a586-be0a382752c9" containerName="manager"
Apr 16 22:23:38.110827 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.108861 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2242979c-1d41-449b-a586-be0a382752c9" containerName="manager"
Apr 16 22:23:38.110827 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.108907 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2242979c-1d41-449b-a586-be0a382752c9" containerName="manager"
Apr 16 22:23:38.111748 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.111733 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.114460 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.114428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 22:23:38.114589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.114460 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-37a6e-kube-rbac-proxy-sar-config\""
Apr 16 22:23:38.114589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.114474 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4hqzk\""
Apr 16 22:23:38.114589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.114509 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 22:23:38.114589 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.114511 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-37a6e-predictor-serving-cert\""
Apr 16 22:23:38.126282 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.126257 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"]
Apr 16 22:23:38.201876 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.201833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9qg\" (UniqueName: \"kubernetes.io/projected/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-kube-api-access-lq9qg\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.202056 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.201900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-success-200-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.202056 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.201955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.302702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.302665 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9qg\" (UniqueName: \"kubernetes.io/projected/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-kube-api-access-lq9qg\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.302896 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.302732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-success-200-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.302896 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.302767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.302896 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:23:38.302890 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-serving-cert: secret "success-200-isvc-37a6e-predictor-serving-cert" not found
Apr 16 22:23:38.303102 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:23:38.303030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls podName:9ec70367-30a9-4fa5-9cbe-428dbcfda3a6 nodeName:}" failed. No retries permitted until 2026-04-16 22:23:38.803006376 +0000 UTC m=+606.045822416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls") pod "success-200-isvc-37a6e-predictor-848875cf66-xkpmt" (UID: "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6") : secret "success-200-isvc-37a6e-predictor-serving-cert" not found
Apr 16 22:23:38.303398 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.303373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-success-200-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.310610 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.310583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9qg\" (UniqueName: \"kubernetes.io/projected/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-kube-api-access-lq9qg\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.380692 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.380615 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"]
Apr 16 22:23:38.383898 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.383879 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.386037 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.386018 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 16 22:23:38.386137 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.386018 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 16 22:23:38.392846 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.392819 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"]
Apr 16 22:23:38.504801 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.504773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54c63a46-7546-4114-b513-114ee537f8f7-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.504987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.504815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q2q7\" (UniqueName: \"kubernetes.io/projected/54c63a46-7546-4114-b513-114ee537f8f7-kube-api-access-5q2q7\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.504987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.504853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c63a46-7546-4114-b513-114ee537f8f7-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.504987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.504900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54c63a46-7546-4114-b513-114ee537f8f7-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.605896 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.605871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q2q7\" (UniqueName: \"kubernetes.io/projected/54c63a46-7546-4114-b513-114ee537f8f7-kube-api-access-5q2q7\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.606063 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.605910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c63a46-7546-4114-b513-114ee537f8f7-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.606063 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.605978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54c63a46-7546-4114-b513-114ee537f8f7-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.606063 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.606010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54c63a46-7546-4114-b513-114ee537f8f7-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.606376 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.606354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c63a46-7546-4114-b513-114ee537f8f7-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.606617 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.606595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54c63a46-7546-4114-b513-114ee537f8f7-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.608203 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.608187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54c63a46-7546-4114-b513-114ee537f8f7-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.614018 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.613997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q2q7\" (UniqueName: \"kubernetes.io/projected/54c63a46-7546-4114-b513-114ee537f8f7-kube-api-access-5q2q7\") pod \"isvc-xgboost-graph-predictor-669d8d6456-qx4w5\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.696395 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.696307 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:23:38.808649 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.808614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.810844 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.810827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls\") pod \"success-200-isvc-37a6e-predictor-848875cf66-xkpmt\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:38.814697 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:38.814673 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"]
Apr 16 22:23:38.817606 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:23:38.817582 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c63a46_7546_4114_b513_114ee537f8f7.slice/crio-a0cc69b320b1d0373b74ef945672271a79c978bf63cfd9e984e2dc6b54d442f2 WatchSource:0}: Error finding container a0cc69b320b1d0373b74ef945672271a79c978bf63cfd9e984e2dc6b54d442f2: Status 404 returned error can't find the container with id a0cc69b320b1d0373b74ef945672271a79c978bf63cfd9e984e2dc6b54d442f2
Apr 16 22:23:39.021222 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:39.021155 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:39.136840 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:39.136713 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"]
Apr 16 22:23:39.139282 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:23:39.139257 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec70367_30a9_4fa5_9cbe_428dbcfda3a6.slice/crio-a54bfcdeac0ad240cbf5c7fe31e716926c149e72301cb5948ee86727361ad82f WatchSource:0}: Error finding container a54bfcdeac0ad240cbf5c7fe31e716926c149e72301cb5948ee86727361ad82f: Status 404 returned error can't find the container with id a54bfcdeac0ad240cbf5c7fe31e716926c149e72301cb5948ee86727361ad82f
Apr 16 22:23:39.141777 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:39.141754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerStarted","Data":"a0cc69b320b1d0373b74ef945672271a79c978bf63cfd9e984e2dc6b54d442f2"}
Apr 16 22:23:40.148548 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:40.148507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" event={"ID":"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6","Type":"ContainerStarted","Data":"a54bfcdeac0ad240cbf5c7fe31e716926c149e72301cb5948ee86727361ad82f"}
Apr 16 22:23:43.162489 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:43.162444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerStarted","Data":"450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1"}
Apr 16 22:23:43.879186 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:43.879116 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f75776864-449jw" podUID="7a3f46cc-780b-4bc5-8dc9-793c584581df" containerName="console" containerID="cri-o://525c94e4a55f39018309ba782cb559341682705474e29d3854e7b1cd1b6d3f11" gracePeriod=15
Apr 16 22:23:44.167478 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:44.167400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f75776864-449jw_7a3f46cc-780b-4bc5-8dc9-793c584581df/console/0.log"
Apr 16 22:23:44.167478 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:44.167448 2574 generic.go:358] "Generic (PLEG): container finished" podID="7a3f46cc-780b-4bc5-8dc9-793c584581df" containerID="525c94e4a55f39018309ba782cb559341682705474e29d3854e7b1cd1b6d3f11" exitCode=2
Apr 16 22:23:44.167900 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:44.167530 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f75776864-449jw" event={"ID":"7a3f46cc-780b-4bc5-8dc9-793c584581df","Type":"ContainerDied","Data":"525c94e4a55f39018309ba782cb559341682705474e29d3854e7b1cd1b6d3f11"}
Apr 16 22:23:48.180624 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.180584 2574 generic.go:358] "Generic (PLEG): container finished" podID="54c63a46-7546-4114-b513-114ee537f8f7" containerID="450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1" exitCode=0
Apr 16 22:23:48.181075 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.180671 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerDied","Data":"450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1"}
Apr 16 22:23:48.729900 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.729874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f75776864-449jw_7a3f46cc-780b-4bc5-8dc9-793c584581df/console/0.log"
Apr 16 22:23:48.730109 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.729956 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:23:48.803903 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.803870 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-config\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804104 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.803916 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpfqb\" (UniqueName: \"kubernetes.io/projected/7a3f46cc-780b-4bc5-8dc9-793c584581df-kube-api-access-vpfqb\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804104 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.803965 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-service-ca\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804104 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.803984 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-trusted-ca-bundle\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804104 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804011 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-serving-cert\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804104 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804074 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-oauth-config\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804368 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804112 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-oauth-serving-cert\") pod \"7a3f46cc-780b-4bc5-8dc9-793c584581df\" (UID: \"7a3f46cc-780b-4bc5-8dc9-793c584581df\") "
Apr 16 22:23:48.804550 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804504 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:23:48.804550 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804530 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-service-ca" (OuterVolumeSpecName: "service-ca") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:23:48.804728 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804590 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:23:48.804794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.804754 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-config" (OuterVolumeSpecName: "console-config") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:23:48.806603 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.806570 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:23:48.806721 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.806676 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:23:48.806721 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.806708 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3f46cc-780b-4bc5-8dc9-793c584581df-kube-api-access-vpfqb" (OuterVolumeSpecName: "kube-api-access-vpfqb") pod "7a3f46cc-780b-4bc5-8dc9-793c584581df" (UID: "7a3f46cc-780b-4bc5-8dc9-793c584581df"). InnerVolumeSpecName "kube-api-access-vpfqb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:23:48.905585 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905553 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-oauth-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:48.905585 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905582 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-oauth-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:48.905794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905595 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:48.905794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905606 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpfqb\" (UniqueName: \"kubernetes.io/projected/7a3f46cc-780b-4bc5-8dc9-793c584581df-kube-api-access-vpfqb\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:48.905794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905615 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-service-ca\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:48.905794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905624 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a3f46cc-780b-4bc5-8dc9-793c584581df-trusted-ca-bundle\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:48.905794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:48.905632 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3f46cc-780b-4bc5-8dc9-793c584581df-console-serving-cert\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:23:49.186673 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.186641 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f75776864-449jw_7a3f46cc-780b-4bc5-8dc9-793c584581df/console/0.log"
Apr 16 22:23:49.187117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.186737 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f75776864-449jw" event={"ID":"7a3f46cc-780b-4bc5-8dc9-793c584581df","Type":"ContainerDied","Data":"dfb92671ab9332cdb4366c5c7bc53dedf373c808869294342275ea568b190e4d"}
Apr 16 22:23:49.187117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.186770 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f75776864-449jw"
Apr 16 22:23:49.187117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.186779 2574 scope.go:117] "RemoveContainer" containerID="525c94e4a55f39018309ba782cb559341682705474e29d3854e7b1cd1b6d3f11"
Apr 16 22:23:49.208599 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.208569 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f75776864-449jw"]
Apr 16 22:23:49.214280 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.214248 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f75776864-449jw"]
Apr 16 22:23:49.450237 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:49.450150 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3f46cc-780b-4bc5-8dc9-793c584581df" path="/var/lib/kubelet/pods/7a3f46cc-780b-4bc5-8dc9-793c584581df/volumes"
Apr 16 22:23:51.197433 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:51.197392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" event={"ID":"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6","Type":"ContainerStarted","Data":"c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c"}
Apr 16 22:23:54.211827 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:54.211726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" event={"ID":"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6","Type":"ContainerStarted","Data":"7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721"}
Apr 16 22:23:54.212354 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:54.212024 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:54.212354 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:54.212058 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:23:54.213171 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:54.213145 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 16 22:23:54.229666 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:54.229615 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podStartSLOduration=1.366639377 podStartE2EDuration="16.229597779s" podCreationTimestamp="2026-04-16 22:23:38 +0000 UTC" firstStartedPulling="2026-04-16 22:23:39.141006994 +0000 UTC m=+606.383823036" lastFinishedPulling="2026-04-16 22:23:54.003965394 +0000 UTC m=+621.246781438" observedRunningTime="2026-04-16 22:23:54.227553999 +0000 UTC m=+621.470370061" watchObservedRunningTime="2026-04-16 22:23:54.229597779 +0000 UTC m=+621.472413843"
Apr 16 22:23:55.214818 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:23:55.214782 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 16 22:24:00.221688 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:00.221660 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:24:00.222336 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:00.222295 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 16 22:24:09.259530 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:09.259451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerStarted","Data":"492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48"}
Apr 16 22:24:09.259530 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:09.259487 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerStarted","Data":"a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5"}
Apr 16 22:24:09.260043 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:09.259762 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:24:09.277189 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:09.277141 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podStartSLOduration=1.222685389 podStartE2EDuration="31.277129491s" podCreationTimestamp="2026-04-16 22:23:38 +0000 UTC" firstStartedPulling="2026-04-16 22:23:38.819285933 +0000 UTC m=+606.062101972" lastFinishedPulling="2026-04-16 22:24:08.873730035 +0000 UTC m=+636.116546074" observedRunningTime="2026-04-16 22:24:09.275118558 +0000 UTC m=+636.517934619" watchObservedRunningTime="2026-04-16 22:24:09.277129491 +0000 UTC m=+636.519945552"
Apr 16 22:24:10.222696 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:10.222656 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 16 22:24:10.262559 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:10.262531 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:24:10.263473 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:10.263453 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 22:24:11.265292 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:11.265248 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 22:24:16.270856 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:16.270828 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:24:16.271450 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:16.271422 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 22:24:20.222851 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:20.222809 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 16 22:24:26.271524 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:26.271482 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 22:24:30.223095 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:30.223052 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 16 22:24:36.272324 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:36.272281 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 22:24:40.223099 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:40.223068 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"
Apr 16 22:24:46.271999 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:46.271957 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 16 22:24:56.271991 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:24:56.271953 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 22:25:06.271749 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:06.271707 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 16 22:25:12.231360 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.231288 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"] Apr 16 22:25:12.231719 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.231565 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container" containerID="cri-o://c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c" gracePeriod=30 Apr 16 22:25:12.231719 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.231596 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kube-rbac-proxy" containerID="cri-o://7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721" gracePeriod=30 Apr 16 22:25:12.302844 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.302815 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"] Apr 16 22:25:12.303192 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.303175 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3f46cc-780b-4bc5-8dc9-793c584581df" containerName="console" Apr 16 22:25:12.303279 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.303195 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f46cc-780b-4bc5-8dc9-793c584581df" containerName="console" Apr 16 22:25:12.303279 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.303261 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a3f46cc-780b-4bc5-8dc9-793c584581df" containerName="console" Apr 16 22:25:12.306496 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.306475 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.309596 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.309571 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-73a06-predictor-serving-cert\"" Apr 16 22:25:12.309709 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.309592 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-73a06-kube-rbac-proxy-sar-config\"" Apr 16 22:25:12.316429 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.316405 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"] Apr 16 22:25:12.357761 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.357730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95077faa-d05d-4fd0-94f2-1632bdffbddf-success-200-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " 
pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.357880 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.357766 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcknm\" (UniqueName: \"kubernetes.io/projected/95077faa-d05d-4fd0-94f2-1632bdffbddf-kube-api-access-fcknm\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.357880 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.357814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.429919 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.429882 2574 generic.go:358] "Generic (PLEG): container finished" podID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerID="7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721" exitCode=2 Apr 16 22:25:12.430072 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.429968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" event={"ID":"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6","Type":"ContainerDied","Data":"7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721"} Apr 16 22:25:12.458941 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.458898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: 
\"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.459072 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.458985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95077faa-d05d-4fd0-94f2-1632bdffbddf-success-200-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.459072 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.459010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcknm\" (UniqueName: \"kubernetes.io/projected/95077faa-d05d-4fd0-94f2-1632bdffbddf-kube-api-access-fcknm\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.459072 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:12.459047 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-73a06-predictor-serving-cert: secret "success-200-isvc-73a06-predictor-serving-cert" not found Apr 16 22:25:12.459204 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:12.459109 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls podName:95077faa-d05d-4fd0-94f2-1632bdffbddf nodeName:}" failed. No retries permitted until 2026-04-16 22:25:12.959089514 +0000 UTC m=+700.201905554 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls") pod "success-200-isvc-73a06-predictor-78588c44d-wh7ft" (UID: "95077faa-d05d-4fd0-94f2-1632bdffbddf") : secret "success-200-isvc-73a06-predictor-serving-cert" not found Apr 16 22:25:12.459557 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.459538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95077faa-d05d-4fd0-94f2-1632bdffbddf-success-200-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.467519 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.467491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcknm\" (UniqueName: \"kubernetes.io/projected/95077faa-d05d-4fd0-94f2-1632bdffbddf-kube-api-access-fcknm\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.963101 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.963066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls\") pod \"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:12.965407 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:12.965389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls\") pod 
\"success-200-isvc-73a06-predictor-78588c44d-wh7ft\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:13.217043 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:13.216965 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:13.333510 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:13.333481 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"] Apr 16 22:25:13.336872 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:25:13.336845 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95077faa_d05d_4fd0_94f2_1632bdffbddf.slice/crio-76f529fbe6444d904277478b81d29c118fbd22e462696a712581fef390eba714 WatchSource:0}: Error finding container 76f529fbe6444d904277478b81d29c118fbd22e462696a712581fef390eba714: Status 404 returned error can't find the container with id 76f529fbe6444d904277478b81d29c118fbd22e462696a712581fef390eba714 Apr 16 22:25:13.338958 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:13.338943 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:25:13.437460 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:13.437426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" event={"ID":"95077faa-d05d-4fd0-94f2-1632bdffbddf","Type":"ContainerStarted","Data":"9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522"} Apr 16 22:25:13.437562 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:13.437467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" 
event={"ID":"95077faa-d05d-4fd0-94f2-1632bdffbddf","Type":"ContainerStarted","Data":"76f529fbe6444d904277478b81d29c118fbd22e462696a712581fef390eba714"} Apr 16 22:25:14.441400 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:14.441364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" event={"ID":"95077faa-d05d-4fd0-94f2-1632bdffbddf","Type":"ContainerStarted","Data":"5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c"} Apr 16 22:25:14.441768 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:14.441491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:14.458962 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:14.458904 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podStartSLOduration=2.458889837 podStartE2EDuration="2.458889837s" podCreationTimestamp="2026-04-16 22:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:14.457167383 +0000 UTC m=+701.699983445" watchObservedRunningTime="2026-04-16 22:25:14.458889837 +0000 UTC m=+701.701705897" Apr 16 22:25:15.268302 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.268281 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" Apr 16 22:25:15.384420 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.384387 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-success-200-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " Apr 16 22:25:15.384603 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.384444 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls\") pod \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " Apr 16 22:25:15.384603 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.384465 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9qg\" (UniqueName: \"kubernetes.io/projected/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-kube-api-access-lq9qg\") pod \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\" (UID: \"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6\") " Apr 16 22:25:15.384768 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.384743 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-success-200-isvc-37a6e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-37a6e-kube-rbac-proxy-sar-config") pod "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" (UID: "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6"). InnerVolumeSpecName "success-200-isvc-37a6e-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:15.386636 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.386610 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" (UID: "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:15.386725 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.386686 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-kube-api-access-lq9qg" (OuterVolumeSpecName: "kube-api-access-lq9qg") pod "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" (UID: "9ec70367-30a9-4fa5-9cbe-428dbcfda3a6"). InnerVolumeSpecName "kube-api-access-lq9qg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:15.445475 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.445441 2574 generic.go:358] "Generic (PLEG): container finished" podID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerID="c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c" exitCode=0 Apr 16 22:25:15.445884 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.445574 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" Apr 16 22:25:15.447448 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.447416 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 22:25:15.448284 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.448270 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:15.448351 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.448289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" event={"ID":"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6","Type":"ContainerDied","Data":"c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c"} Apr 16 22:25:15.448351 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.448305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" event={"ID":"9ec70367-30a9-4fa5-9cbe-428dbcfda3a6","Type":"ContainerDied","Data":"a54bfcdeac0ad240cbf5c7fe31e716926c149e72301cb5948ee86727361ad82f"} Apr 16 22:25:15.448351 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.448320 2574 scope.go:117] "RemoveContainer" containerID="7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721" Apr 16 22:25:15.457234 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.457212 2574 scope.go:117] "RemoveContainer" containerID="c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c" Apr 16 22:25:15.464137 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.464120 2574 scope.go:117] "RemoveContainer" 
containerID="7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721" Apr 16 22:25:15.464384 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:15.464367 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721\": container with ID starting with 7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721 not found: ID does not exist" containerID="7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721" Apr 16 22:25:15.464434 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.464391 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721"} err="failed to get container status \"7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721\": rpc error: code = NotFound desc = could not find container \"7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721\": container with ID starting with 7eed3bf2e3f3fdb8bfbeef05ce19d7d38160c5534fd1fb0895b6409fef4f3721 not found: ID does not exist" Apr 16 22:25:15.464434 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.464409 2574 scope.go:117] "RemoveContainer" containerID="c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c" Apr 16 22:25:15.464657 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:15.464638 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c\": container with ID starting with c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c not found: ID does not exist" containerID="c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c" Apr 16 22:25:15.464701 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.464663 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c"} err="failed to get container status \"c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c\": rpc error: code = NotFound desc = could not find container \"c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c\": container with ID starting with c75f0136c41e7d656ed7685bd6b1f799017377bae159cead05ca292e8c0fc29c not found: ID does not exist" Apr 16 22:25:15.471080 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.471060 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"] Apr 16 22:25:15.472204 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.472186 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt"] Apr 16 22:25:15.484897 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.484875 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-success-200-isvc-37a6e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:25:15.484897 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.484895 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:25:15.485011 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:15.484907 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lq9qg\" (UniqueName: \"kubernetes.io/projected/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6-kube-api-access-lq9qg\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:25:16.215856 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:16.215811 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37a6e-predictor-848875cf66-xkpmt" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 16 22:25:16.272131 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:16.272101 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" Apr 16 22:25:16.450649 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:16.450611 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 22:25:17.448554 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:17.448520 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" path="/var/lib/kubelet/pods/9ec70367-30a9-4fa5-9cbe-428dbcfda3a6/volumes" Apr 16 22:25:21.455380 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:21.455351 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:25:21.455861 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:21.455817 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 22:25:31.455848 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:31.455807 2574 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 16 22:25:41.456273 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:41.456233 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 16 22:25:48.140614 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140582 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"]
Apr 16 22:25:48.141030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140855 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kube-rbac-proxy"
Apr 16 22:25:48.141030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140865 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kube-rbac-proxy"
Apr 16 22:25:48.141030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140884 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container"
Apr 16 22:25:48.141030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140891 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container"
Apr 16 22:25:48.141030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140958 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kube-rbac-proxy"
Apr 16 22:25:48.141030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.140968 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ec70367-30a9-4fa5-9cbe-428dbcfda3a6" containerName="kserve-container"
Apr 16 22:25:48.142807 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.142789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.145118 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.145096 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d7c57-predictor-serving-cert\""
Apr 16 22:25:48.145238 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.145101 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d7c57-kube-rbac-proxy-sar-config\""
Apr 16 22:25:48.157492 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.157468 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"]
Apr 16 22:25:48.191278 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.191247 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"]
Apr 16 22:25:48.191548 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.191528 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container" containerID="cri-o://a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5" gracePeriod=30
Apr 16 22:25:48.191642 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.191618 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kube-rbac-proxy" containerID="cri-o://492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48" gracePeriod=30
Apr 16 22:25:48.242944 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.242896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvhlb\" (UniqueName: \"kubernetes.io/projected/481e9eac-3b6d-454f-b426-0daafedf5843-kube-api-access-nvhlb\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.243102 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.242962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/481e9eac-3b6d-454f-b426-0daafedf5843-success-200-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.243102 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.243057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.344069 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.344038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/481e9eac-3b6d-454f-b426-0daafedf5843-success-200-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.344225 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.344093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.344225 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.344133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvhlb\" (UniqueName: \"kubernetes.io/projected/481e9eac-3b6d-454f-b426-0daafedf5843-kube-api-access-nvhlb\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.344331 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:48.344256 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-serving-cert: secret "success-200-isvc-d7c57-predictor-serving-cert" not found
Apr 16 22:25:48.344331 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:48.344328 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls podName:481e9eac-3b6d-454f-b426-0daafedf5843 nodeName:}" failed. No retries permitted until 2026-04-16 22:25:48.84430487 +0000 UTC m=+736.087120925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls") pod "success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" (UID: "481e9eac-3b6d-454f-b426-0daafedf5843") : secret "success-200-isvc-d7c57-predictor-serving-cert" not found
Apr 16 22:25:48.344670 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.344648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/481e9eac-3b6d-454f-b426-0daafedf5843-success-200-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.352164 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.352141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvhlb\" (UniqueName: \"kubernetes.io/projected/481e9eac-3b6d-454f-b426-0daafedf5843-kube-api-access-nvhlb\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.535591 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.535486 2574 generic.go:358] "Generic (PLEG): container finished" podID="54c63a46-7546-4114-b513-114ee537f8f7" containerID="492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48" exitCode=2
Apr 16 22:25:48.535591 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.535552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerDied","Data":"492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48"}
Apr 16 22:25:48.847606 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.847522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:48.849865 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:48.849841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls\") pod \"success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") " pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:49.052704 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.052663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:49.177462 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.177437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"]
Apr 16 22:25:49.179951 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:25:49.179903 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod481e9eac_3b6d_454f_b426_0daafedf5843.slice/crio-795e8fbc61bb9fd3ee7410b451903f5b0ce6769fae13f993152277b3f334f007 WatchSource:0}: Error finding container 795e8fbc61bb9fd3ee7410b451903f5b0ce6769fae13f993152277b3f334f007: Status 404 returned error can't find the container with id 795e8fbc61bb9fd3ee7410b451903f5b0ce6769fae13f993152277b3f334f007
Apr 16 22:25:49.540465 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.540429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" event={"ID":"481e9eac-3b6d-454f-b426-0daafedf5843","Type":"ContainerStarted","Data":"b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2"}
Apr 16 22:25:49.540636 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.540469 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" event={"ID":"481e9eac-3b6d-454f-b426-0daafedf5843","Type":"ContainerStarted","Data":"a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614"}
Apr 16 22:25:49.540636 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.540484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" event={"ID":"481e9eac-3b6d-454f-b426-0daafedf5843","Type":"ContainerStarted","Data":"795e8fbc61bb9fd3ee7410b451903f5b0ce6769fae13f993152277b3f334f007"}
Apr 16 22:25:49.540636 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.540536 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:49.558577 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:49.558532 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podStartSLOduration=1.558518719 podStartE2EDuration="1.558518719s" podCreationTimestamp="2026-04-16 22:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:49.557688528 +0000 UTC m=+736.800504590" watchObservedRunningTime="2026-04-16 22:25:49.558518719 +0000 UTC m=+736.801334779"
Apr 16 22:25:50.543340 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:50.543307 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:50.544556 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:50.544530 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 22:25:51.266494 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.266451 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused"
Apr 16 22:25:51.456702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.456668 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused"
Apr 16 22:25:51.546169 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.546083 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 22:25:51.935338 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.935312 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:25:51.974276 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.974245 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54c63a46-7546-4114-b513-114ee537f8f7-proxy-tls\") pod \"54c63a46-7546-4114-b513-114ee537f8f7\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") "
Apr 16 22:25:51.974442 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.974295 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c63a46-7546-4114-b513-114ee537f8f7-kserve-provision-location\") pod \"54c63a46-7546-4114-b513-114ee537f8f7\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") "
Apr 16 22:25:51.974442 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.974328 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54c63a46-7546-4114-b513-114ee537f8f7-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"54c63a46-7546-4114-b513-114ee537f8f7\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") "
Apr 16 22:25:51.974442 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.974347 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q2q7\" (UniqueName: \"kubernetes.io/projected/54c63a46-7546-4114-b513-114ee537f8f7-kube-api-access-5q2q7\") pod \"54c63a46-7546-4114-b513-114ee537f8f7\" (UID: \"54c63a46-7546-4114-b513-114ee537f8f7\") "
Apr 16 22:25:51.974644 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.974616 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c63a46-7546-4114-b513-114ee537f8f7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54c63a46-7546-4114-b513-114ee537f8f7" (UID: "54c63a46-7546-4114-b513-114ee537f8f7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:25:51.974695 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.974672 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c63a46-7546-4114-b513-114ee537f8f7-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "54c63a46-7546-4114-b513-114ee537f8f7" (UID: "54c63a46-7546-4114-b513-114ee537f8f7"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:25:51.976399 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.976365 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c63a46-7546-4114-b513-114ee537f8f7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "54c63a46-7546-4114-b513-114ee537f8f7" (UID: "54c63a46-7546-4114-b513-114ee537f8f7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:25:51.976504 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:51.976404 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c63a46-7546-4114-b513-114ee537f8f7-kube-api-access-5q2q7" (OuterVolumeSpecName: "kube-api-access-5q2q7") pod "54c63a46-7546-4114-b513-114ee537f8f7" (UID: "54c63a46-7546-4114-b513-114ee537f8f7"). InnerVolumeSpecName "kube-api-access-5q2q7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:25:52.075034 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.074923 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54c63a46-7546-4114-b513-114ee537f8f7-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:25:52.075034 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.074979 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54c63a46-7546-4114-b513-114ee537f8f7-kserve-provision-location\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:25:52.075034 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.074993 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54c63a46-7546-4114-b513-114ee537f8f7-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:25:52.075034 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.075008 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5q2q7\" (UniqueName: \"kubernetes.io/projected/54c63a46-7546-4114-b513-114ee537f8f7-kube-api-access-5q2q7\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:25:52.550317 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.550275 2574 generic.go:358] "Generic (PLEG): container finished" podID="54c63a46-7546-4114-b513-114ee537f8f7" containerID="a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5" exitCode=0
Apr 16 22:25:52.550703 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.550354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerDied","Data":"a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5"}
Apr 16 22:25:52.550703 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.550370 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"
Apr 16 22:25:52.550703 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.550394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5" event={"ID":"54c63a46-7546-4114-b513-114ee537f8f7","Type":"ContainerDied","Data":"a0cc69b320b1d0373b74ef945672271a79c978bf63cfd9e984e2dc6b54d442f2"}
Apr 16 22:25:52.550703 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.550410 2574 scope.go:117] "RemoveContainer" containerID="492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48"
Apr 16 22:25:52.557909 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.557888 2574 scope.go:117] "RemoveContainer" containerID="a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5"
Apr 16 22:25:52.564529 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.564508 2574 scope.go:117] "RemoveContainer" containerID="450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1"
Apr 16 22:25:52.570917 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.570895 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"]
Apr 16 22:25:52.571112 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.571064 2574 scope.go:117] "RemoveContainer" containerID="492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48"
Apr 16 22:25:52.571297 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:52.571276 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48\": container with ID starting with 492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48 not found: ID does not exist" containerID="492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48"
Apr 16 22:25:52.571344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.571307 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48"} err="failed to get container status \"492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48\": rpc error: code = NotFound desc = could not find container \"492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48\": container with ID starting with 492771801ad3ee7be372f875dc34f49b6340c581569fbd890877d85fcdf96b48 not found: ID does not exist"
Apr 16 22:25:52.571344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.571331 2574 scope.go:117] "RemoveContainer" containerID="a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5"
Apr 16 22:25:52.571584 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:52.571560 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5\": container with ID starting with a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5 not found: ID does not exist" containerID="a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5"
Apr 16 22:25:52.571664 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.571590 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5"} err="failed to get container status \"a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5\": rpc error: code = NotFound desc = could not find container \"a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5\": container with ID starting with a27445f174d15445b7066fda709e87a129eb26a7b70ff1735f02b7fe6cba92a5 not found: ID does not exist"
Apr 16 22:25:52.571664 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.571607 2574 scope.go:117] "RemoveContainer" containerID="450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1"
Apr 16 22:25:52.571806 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:25:52.571789 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1\": container with ID starting with 450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1 not found: ID does not exist" containerID="450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1"
Apr 16 22:25:52.571857 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.571815 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1"} err="failed to get container status \"450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1\": rpc error: code = NotFound desc = could not find container \"450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1\": container with ID starting with 450286c8adce4ea73081304c00694795c40e026c9652191c44143f726e9a3ab1 not found: ID does not exist"
Apr 16 22:25:52.576127 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:52.576106 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-qx4w5"]
Apr 16 22:25:53.451396 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:53.451358 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c63a46-7546-4114-b513-114ee537f8f7" path="/var/lib/kubelet/pods/54c63a46-7546-4114-b513-114ee537f8f7/volumes"
Apr 16 22:25:56.550813 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:56.550782 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:25:56.551365 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:25:56.551338 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 22:26:01.457083 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:26:01.457057 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"
Apr 16 22:26:06.551548 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:26:06.551512 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 22:26:16.551359 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:26:16.551324 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 22:26:26.551969 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:26:26.551906 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 22:26:36.551955 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:26:36.551913 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:28:33.362427 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:28:33.362358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:28:33.362427 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:28:33.362358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:33:33.383040 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:33:33.383014 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:33:33.385626 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:33:33.384514 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:34:26.888027 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.887948 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"]
Apr 16 22:34:26.888491 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.888313 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" containerID="cri-o://9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522" gracePeriod=30
Apr 16 22:34:26.888491 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.888390 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kube-rbac-proxy" containerID="cri-o://5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c" gracePeriod=30
Apr 16 22:34:26.969423 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969392 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"]
Apr 16 22:34:26.969660 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969649 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kube-rbac-proxy"
Apr 16 22:34:26.969702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969662 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kube-rbac-proxy"
Apr 16 22:34:26.969702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969674 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container"
Apr 16 22:34:26.969702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969681 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container"
Apr 16 22:34:26.969702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969700 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="storage-initializer"
Apr 16 22:34:26.969821 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969705 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="storage-initializer"
Apr 16 22:34:26.969821 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969745 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kserve-container"
Apr 16 22:34:26.969821 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.969754 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="54c63a46-7546-4114-b513-114ee537f8f7" containerName="kube-rbac-proxy"
Apr 16 22:34:26.972672 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.972657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:26.975241 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.975215 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1148f-kube-rbac-proxy-sar-config\""
Apr 16 22:34:26.975241 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.975235 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1148f-predictor-serving-cert\""
Apr 16 22:34:26.991146 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:26.991122 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"]
Apr 16 22:34:27.128601 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.128563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmdgx\" (UniqueName: \"kubernetes.io/projected/75940772-1803-4b20-a926-4128e5d5deb5-kube-api-access-mmdgx\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.128779 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.128642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75940772-1803-4b20-a926-4128e5d5deb5-success-200-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.128779 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.128677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.229246 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.229157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmdgx\" (UniqueName: \"kubernetes.io/projected/75940772-1803-4b20-a926-4128e5d5deb5-kube-api-access-mmdgx\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.229246 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.229213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75940772-1803-4b20-a926-4128e5d5deb5-success-200-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.229246 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.229237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.229461 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:34:27.229321 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-1148f-predictor-serving-cert: secret "success-200-isvc-1148f-predictor-serving-cert" not found
Apr 16 22:34:27.229461 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:34:27.229383 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls podName:75940772-1803-4b20-a926-4128e5d5deb5 nodeName:}" failed. No retries permitted until 2026-04-16 22:34:27.729361706 +0000 UTC m=+1254.972177744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls") pod "success-200-isvc-1148f-predictor-7477c489b6-228hl" (UID: "75940772-1803-4b20-a926-4128e5d5deb5") : secret "success-200-isvc-1148f-predictor-serving-cert" not found
Apr 16 22:34:27.229899 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.229877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75940772-1803-4b20-a926-4128e5d5deb5-success-200-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.237640 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.237613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmdgx\" (UniqueName: \"kubernetes.io/projected/75940772-1803-4b20-a926-4128e5d5deb5-kube-api-access-mmdgx\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.734358 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.734313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:34:27.734529 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:34:27.734454 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-1148f-predictor-serving-cert: secret "success-200-isvc-1148f-predictor-serving-cert" not found
Apr 16 22:34:27.734529 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:34:27.734519 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls podName:75940772-1803-4b20-a926-4128e5d5deb5 nodeName:}" failed. No retries permitted until 2026-04-16 22:34:28.734502706 +0000 UTC m=+1255.977318744 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls") pod "success-200-isvc-1148f-predictor-7477c489b6-228hl" (UID: "75940772-1803-4b20-a926-4128e5d5deb5") : secret "success-200-isvc-1148f-predictor-serving-cert" not found Apr 16 22:34:27.951708 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.951672 2574 generic.go:358] "Generic (PLEG): container finished" podID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerID="5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c" exitCode=2 Apr 16 22:34:27.951708 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:27.951692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" event={"ID":"95077faa-d05d-4fd0-94f2-1632bdffbddf","Type":"ContainerDied","Data":"5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c"} Apr 16 22:34:28.741966 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:28.741915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" Apr 16 22:34:28.744331 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:28.744308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") pod \"success-200-isvc-1148f-predictor-7477c489b6-228hl\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") " pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" Apr 16 22:34:28.784484 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:28.784449 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" Apr 16 22:34:28.903497 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:28.903465 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"] Apr 16 22:34:28.906484 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:34:28.906457 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75940772_1803_4b20_a926_4128e5d5deb5.slice/crio-c38f31a40e9f0445d0d65cea392440f5083cbd66fecd8b7b2764e265526e0021 WatchSource:0}: Error finding container c38f31a40e9f0445d0d65cea392440f5083cbd66fecd8b7b2764e265526e0021: Status 404 returned error can't find the container with id c38f31a40e9f0445d0d65cea392440f5083cbd66fecd8b7b2764e265526e0021 Apr 16 22:34:28.908141 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:28.908125 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:34:28.955020 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:28.954992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" event={"ID":"75940772-1803-4b20-a926-4128e5d5deb5","Type":"ContainerStarted","Data":"c38f31a40e9f0445d0d65cea392440f5083cbd66fecd8b7b2764e265526e0021"} Apr 16 22:34:29.959148 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:29.959117 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" event={"ID":"75940772-1803-4b20-a926-4128e5d5deb5","Type":"ContainerStarted","Data":"1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542"} Apr 16 22:34:29.959148 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:29.959154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" 
event={"ID":"75940772-1803-4b20-a926-4128e5d5deb5","Type":"ContainerStarted","Data":"4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3"} Apr 16 22:34:29.959574 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:29.959284 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" Apr 16 22:34:29.979418 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:29.979370 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podStartSLOduration=3.979357188 podStartE2EDuration="3.979357188s" podCreationTimestamp="2026-04-16 22:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:29.977909553 +0000 UTC m=+1257.220725613" watchObservedRunningTime="2026-04-16 22:34:29.979357188 +0000 UTC m=+1257.222173249" Apr 16 22:34:30.143853 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.143820 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:34:30.254912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.254823 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcknm\" (UniqueName: \"kubernetes.io/projected/95077faa-d05d-4fd0-94f2-1632bdffbddf-kube-api-access-fcknm\") pod \"95077faa-d05d-4fd0-94f2-1632bdffbddf\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " Apr 16 22:34:30.254912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.254876 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95077faa-d05d-4fd0-94f2-1632bdffbddf-success-200-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"95077faa-d05d-4fd0-94f2-1632bdffbddf\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " Apr 16 22:34:30.254912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.254912 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls\") pod \"95077faa-d05d-4fd0-94f2-1632bdffbddf\" (UID: \"95077faa-d05d-4fd0-94f2-1632bdffbddf\") " Apr 16 22:34:30.255302 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.255278 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95077faa-d05d-4fd0-94f2-1632bdffbddf-success-200-isvc-73a06-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-73a06-kube-rbac-proxy-sar-config") pod "95077faa-d05d-4fd0-94f2-1632bdffbddf" (UID: "95077faa-d05d-4fd0-94f2-1632bdffbddf"). InnerVolumeSpecName "success-200-isvc-73a06-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:34:30.257078 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.257052 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "95077faa-d05d-4fd0-94f2-1632bdffbddf" (UID: "95077faa-d05d-4fd0-94f2-1632bdffbddf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:34:30.257177 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.257103 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95077faa-d05d-4fd0-94f2-1632bdffbddf-kube-api-access-fcknm" (OuterVolumeSpecName: "kube-api-access-fcknm") pod "95077faa-d05d-4fd0-94f2-1632bdffbddf" (UID: "95077faa-d05d-4fd0-94f2-1632bdffbddf"). InnerVolumeSpecName "kube-api-access-fcknm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:34:30.355953 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.355898 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcknm\" (UniqueName: \"kubernetes.io/projected/95077faa-d05d-4fd0-94f2-1632bdffbddf-kube-api-access-fcknm\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:34:30.355953 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.355953 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/95077faa-d05d-4fd0-94f2-1632bdffbddf-success-200-isvc-73a06-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:34:30.356147 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.355971 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95077faa-d05d-4fd0-94f2-1632bdffbddf-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 
22:34:30.963480 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.963448 2574 generic.go:358] "Generic (PLEG): container finished" podID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerID="9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522" exitCode=0 Apr 16 22:34:30.963899 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.963526 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" Apr 16 22:34:30.963899 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.963536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" event={"ID":"95077faa-d05d-4fd0-94f2-1632bdffbddf","Type":"ContainerDied","Data":"9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522"} Apr 16 22:34:30.963899 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.963573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft" event={"ID":"95077faa-d05d-4fd0-94f2-1632bdffbddf","Type":"ContainerDied","Data":"76f529fbe6444d904277478b81d29c118fbd22e462696a712581fef390eba714"} Apr 16 22:34:30.963899 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.963593 2574 scope.go:117] "RemoveContainer" containerID="5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c" Apr 16 22:34:30.964189 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.964033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" Apr 16 22:34:30.965599 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.965573 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.24:8080: connect: connection refused" Apr 16 22:34:30.972039 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.972019 2574 scope.go:117] "RemoveContainer" containerID="9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522" Apr 16 22:34:30.979107 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.979088 2574 scope.go:117] "RemoveContainer" containerID="5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c" Apr 16 22:34:30.979345 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:34:30.979324 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c\": container with ID starting with 5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c not found: ID does not exist" containerID="5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c" Apr 16 22:34:30.979399 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.979353 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c"} err="failed to get container status \"5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c\": rpc error: code = NotFound desc = could not find container \"5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c\": container with ID starting with 5c67c6af61a3ff1d4288a1a301ad2171f0574e065c8912cdabdc539b828aba7c not found: ID does not exist" Apr 16 22:34:30.979399 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.979371 2574 scope.go:117] "RemoveContainer" containerID="9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522" Apr 16 22:34:30.979562 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:34:30.979544 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522\": container with ID starting with 9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522 not found: ID does not exist" containerID="9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522" Apr 16 22:34:30.979602 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.979569 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522"} err="failed to get container status \"9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522\": rpc error: code = NotFound desc = could not find container \"9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522\": container with ID starting with 9478cfc66f716ec178f3845b759d991e6d6617d2c936f2153632818afc380522 not found: ID does not exist" Apr 16 22:34:30.984878 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.984855 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"] Apr 16 22:34:30.988723 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:30.988702 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-73a06-predictor-78588c44d-wh7ft"] Apr 16 22:34:31.447808 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:31.447778 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" path="/var/lib/kubelet/pods/95077faa-d05d-4fd0-94f2-1632bdffbddf/volumes" Apr 16 22:34:31.967278 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:31.967236 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 22:34:36.971772 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:34:36.971743 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" Apr 16 22:34:36.972293 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:36.972265 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 22:34:46.972464 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:46.972427 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 22:34:56.972375 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:34:56.972341 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 22:35:02.830798 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.830763 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"] Apr 16 22:35:02.831438 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.831129 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container" containerID="cri-o://a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614" gracePeriod=30 Apr 16 22:35:02.831438 ip-10-0-138-191 
kubenswrapper[2574]: I0416 22:35:02.831168 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kube-rbac-proxy" containerID="cri-o://b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2" gracePeriod=30 Apr 16 22:35:02.863618 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863592 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"] Apr 16 22:35:02.863875 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863864 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" Apr 16 22:35:02.863916 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863876 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" Apr 16 22:35:02.863916 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863885 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kube-rbac-proxy" Apr 16 22:35:02.863916 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863890 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kube-rbac-proxy" Apr 16 22:35:02.864028 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863945 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kube-rbac-proxy" Apr 16 22:35:02.864028 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.863955 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="95077faa-d05d-4fd0-94f2-1632bdffbddf" containerName="kserve-container" Apr 16 22:35:02.866945 ip-10-0-138-191 kubenswrapper[2574]: I0416 
22:35:02.866914 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.869285 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.869266 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-cf933-predictor-serving-cert\"" Apr 16 22:35:02.869377 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.869290 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-cf933-kube-rbac-proxy-sar-config\"" Apr 16 22:35:02.877400 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.877374 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"] Apr 16 22:35:02.895047 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.895024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbmnz\" (UniqueName: \"kubernetes.io/projected/16dd29ae-14fc-440c-95c1-b4ff1758bd52-kube-api-access-fbmnz\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.895163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.895070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.895258 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.895170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"success-200-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16dd29ae-14fc-440c-95c1-b4ff1758bd52-success-200-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.996536 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.996501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16dd29ae-14fc-440c-95c1-b4ff1758bd52-success-200-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.996684 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.996566 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbmnz\" (UniqueName: \"kubernetes.io/projected/16dd29ae-14fc-440c-95c1-b4ff1758bd52-kube-api-access-fbmnz\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.996684 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.996591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:02.996826 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:35:02.996693 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-cf933-predictor-serving-cert: 
secret "success-200-isvc-cf933-predictor-serving-cert" not found Apr 16 22:35:02.996826 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:35:02.996756 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls podName:16dd29ae-14fc-440c-95c1-b4ff1758bd52 nodeName:}" failed. No retries permitted until 2026-04-16 22:35:03.496740854 +0000 UTC m=+1290.739556893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls") pod "success-200-isvc-cf933-predictor-c8d746ff8-62qm9" (UID: "16dd29ae-14fc-440c-95c1-b4ff1758bd52") : secret "success-200-isvc-cf933-predictor-serving-cert" not found Apr 16 22:35:02.997295 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:02.997270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16dd29ae-14fc-440c-95c1-b4ff1758bd52-success-200-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:03.005705 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.005680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbmnz\" (UniqueName: \"kubernetes.io/projected/16dd29ae-14fc-440c-95c1-b4ff1758bd52-kube-api-access-fbmnz\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:03.052988 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.052955 2574 generic.go:358] "Generic (PLEG): container finished" podID="481e9eac-3b6d-454f-b426-0daafedf5843" 
containerID="b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2" exitCode=2 Apr 16 22:35:03.053109 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.053018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" event={"ID":"481e9eac-3b6d-454f-b426-0daafedf5843","Type":"ContainerDied","Data":"b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2"} Apr 16 22:35:03.501351 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.501309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:03.503702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.503669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls\") pod \"success-200-isvc-cf933-predictor-c8d746ff8-62qm9\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:35:03.777046 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.776922 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"
Apr 16 22:35:03.901209 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:03.901109 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"]
Apr 16 22:35:03.903454 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:35:03.903424 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dd29ae_14fc_440c_95c1_b4ff1758bd52.slice/crio-0f0d72323a63b12e13e6498e3f7a8c85b30915da55c8f5dcedb578649551b5db WatchSource:0}: Error finding container 0f0d72323a63b12e13e6498e3f7a8c85b30915da55c8f5dcedb578649551b5db: Status 404 returned error can't find the container with id 0f0d72323a63b12e13e6498e3f7a8c85b30915da55c8f5dcedb578649551b5db
Apr 16 22:35:04.057822 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:04.057737 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" event={"ID":"16dd29ae-14fc-440c-95c1-b4ff1758bd52","Type":"ContainerStarted","Data":"df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f"}
Apr 16 22:35:04.057822 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:04.057771 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" event={"ID":"16dd29ae-14fc-440c-95c1-b4ff1758bd52","Type":"ContainerStarted","Data":"6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f"}
Apr 16 22:35:04.057822 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:04.057783 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" event={"ID":"16dd29ae-14fc-440c-95c1-b4ff1758bd52","Type":"ContainerStarted","Data":"0f0d72323a63b12e13e6498e3f7a8c85b30915da55c8f5dcedb578649551b5db"}
Apr 16 22:35:04.058045 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:04.057877 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"
Apr 16 22:35:04.075656 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:04.075615 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podStartSLOduration=2.075598413 podStartE2EDuration="2.075598413s" podCreationTimestamp="2026-04-16 22:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:35:04.074025063 +0000 UTC m=+1291.316841124" watchObservedRunningTime="2026-04-16 22:35:04.075598413 +0000 UTC m=+1291.318414475"
Apr 16 22:35:05.064700 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:05.063531 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"
Apr 16 22:35:05.065097 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:05.064746 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 22:35:05.989413 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:05.989392 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:35:06.026495 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.026469 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls\") pod \"481e9eac-3b6d-454f-b426-0daafedf5843\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") "
Apr 16 22:35:06.026616 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.026507 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/481e9eac-3b6d-454f-b426-0daafedf5843-success-200-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"481e9eac-3b6d-454f-b426-0daafedf5843\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") "
Apr 16 22:35:06.026616 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.026553 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvhlb\" (UniqueName: \"kubernetes.io/projected/481e9eac-3b6d-454f-b426-0daafedf5843-kube-api-access-nvhlb\") pod \"481e9eac-3b6d-454f-b426-0daafedf5843\" (UID: \"481e9eac-3b6d-454f-b426-0daafedf5843\") "
Apr 16 22:35:06.026897 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.026870 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/481e9eac-3b6d-454f-b426-0daafedf5843-success-200-isvc-d7c57-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d7c57-kube-rbac-proxy-sar-config") pod "481e9eac-3b6d-454f-b426-0daafedf5843" (UID: "481e9eac-3b6d-454f-b426-0daafedf5843"). InnerVolumeSpecName "success-200-isvc-d7c57-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:35:06.028481 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.028449 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "481e9eac-3b6d-454f-b426-0daafedf5843" (UID: "481e9eac-3b6d-454f-b426-0daafedf5843"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:35:06.028559 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.028510 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481e9eac-3b6d-454f-b426-0daafedf5843-kube-api-access-nvhlb" (OuterVolumeSpecName: "kube-api-access-nvhlb") pod "481e9eac-3b6d-454f-b426-0daafedf5843" (UID: "481e9eac-3b6d-454f-b426-0daafedf5843"). InnerVolumeSpecName "kube-api-access-nvhlb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:35:06.065143 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.065112 2574 generic.go:358] "Generic (PLEG): container finished" podID="481e9eac-3b6d-454f-b426-0daafedf5843" containerID="a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614" exitCode=0
Apr 16 22:35:06.065592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.065189 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"
Apr 16 22:35:06.065592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.065187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" event={"ID":"481e9eac-3b6d-454f-b426-0daafedf5843","Type":"ContainerDied","Data":"a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614"}
Apr 16 22:35:06.065592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.065225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86" event={"ID":"481e9eac-3b6d-454f-b426-0daafedf5843","Type":"ContainerDied","Data":"795e8fbc61bb9fd3ee7410b451903f5b0ce6769fae13f993152277b3f334f007"}
Apr 16 22:35:06.065592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.065240 2574 scope.go:117] "RemoveContainer" containerID="b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2"
Apr 16 22:35:06.065822 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.065629 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 22:35:06.072774 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.072753 2574 scope.go:117] "RemoveContainer" containerID="a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614"
Apr 16 22:35:06.080389 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.080368 2574 scope.go:117] "RemoveContainer" containerID="b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2"
Apr 16 22:35:06.080680 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:35:06.080662 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2\": container with ID starting with b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2 not found: ID does not exist" containerID="b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2"
Apr 16 22:35:06.080742 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.080688 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2"} err="failed to get container status \"b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2\": rpc error: code = NotFound desc = could not find container \"b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2\": container with ID starting with b7daec197913dac1ccc52574608a933df2e8e55e6188812a6d89a1bb992988b2 not found: ID does not exist"
Apr 16 22:35:06.080742 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.080707 2574 scope.go:117] "RemoveContainer" containerID="a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614"
Apr 16 22:35:06.080973 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:35:06.080958 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614\": container with ID starting with a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614 not found: ID does not exist" containerID="a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614"
Apr 16 22:35:06.081030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.080976 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614"} err="failed to get container status \"a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614\": rpc error: code = NotFound desc = could not find container \"a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614\": container with ID starting with a067593273bd020d229846e2b5b18be4945cca726cfe05707e971f61f5fcf614 not found: ID does not exist"
Apr 16 22:35:06.088710 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.088657 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"]
Apr 16 22:35:06.094783 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.094762 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d7c57-predictor-6557d7fbbb-8vv86"]
Apr 16 22:35:06.127364 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.127332 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/481e9eac-3b6d-454f-b426-0daafedf5843-success-200-isvc-d7c57-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:35:06.127364 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.127366 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvhlb\" (UniqueName: \"kubernetes.io/projected/481e9eac-3b6d-454f-b426-0daafedf5843-kube-api-access-nvhlb\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:35:06.127558 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.127381 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/481e9eac-3b6d-454f-b426-0daafedf5843-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:35:06.972421 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:06.972382 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 22:35:07.448297 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:07.448265 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" path="/var/lib/kubelet/pods/481e9eac-3b6d-454f-b426-0daafedf5843/volumes"
Apr 16 22:35:11.069591 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:11.069564 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"
Apr 16 22:35:11.070132 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:11.070105 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 22:35:16.973152 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:16.973119 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:35:21.070462 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:21.070420 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 22:35:31.070100 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:31.070055 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 22:35:41.070599 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:41.070553 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 22:35:47.187255 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.187224 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"]
Apr 16 22:35:47.187643 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.187495 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" containerID="cri-o://4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3" gracePeriod=30
Apr 16 22:35:47.187643 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.187544 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kube-rbac-proxy" containerID="cri-o://1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542" gracePeriod=30
Apr 16 22:35:47.216944 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.216901 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"]
Apr 16 22:35:47.217271 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.217254 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kube-rbac-proxy"
Apr 16 22:35:47.217349 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.217274 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kube-rbac-proxy"
Apr 16 22:35:47.217349 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.217285 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container"
Apr 16 22:35:47.217349 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.217293 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container"
Apr 16 22:35:47.217505 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.217354 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kube-rbac-proxy"
Apr 16 22:35:47.217505 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.217369 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="481e9eac-3b6d-454f-b426-0daafedf5843" containerName="kserve-container"
Apr 16 22:35:47.221583 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.221562 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.224024 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.224004 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\""
Apr 16 22:35:47.224136 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.224031 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ea3bc-predictor-serving-cert\""
Apr 16 22:35:47.231909 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.231884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"]
Apr 16 22:35:47.328622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.328590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a618200-841c-40c4-a676-e4845f4aed14-success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.328784 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.328648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a618200-841c-40c4-a676-e4845f4aed14-proxy-tls\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.328784 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.328684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s88tf\" (UniqueName: \"kubernetes.io/projected/9a618200-841c-40c4-a676-e4845f4aed14-kube-api-access-s88tf\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.429551 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.429518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a618200-841c-40c4-a676-e4845f4aed14-success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.429720 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.429562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a618200-841c-40c4-a676-e4845f4aed14-proxy-tls\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.429720 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.429682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s88tf\" (UniqueName: \"kubernetes.io/projected/9a618200-841c-40c4-a676-e4845f4aed14-kube-api-access-s88tf\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.430273 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.430247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a618200-841c-40c4-a676-e4845f4aed14-success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.431917 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.431895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a618200-841c-40c4-a676-e4845f4aed14-proxy-tls\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.437640 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.437583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s88tf\" (UniqueName: \"kubernetes.io/projected/9a618200-841c-40c4-a676-e4845f4aed14-kube-api-access-s88tf\") pod \"success-200-isvc-ea3bc-predictor-5c55999dbb-67zww\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.536648 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.536479 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:47.655825 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:47.655791 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"]
Apr 16 22:35:47.658661 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:35:47.658634 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a618200_841c_40c4_a676_e4845f4aed14.slice/crio-99ff579124b98ff4bd45a5697254358ce2daebdf9fef15766a017588e6e39c59 WatchSource:0}: Error finding container 99ff579124b98ff4bd45a5697254358ce2daebdf9fef15766a017588e6e39c59: Status 404 returned error can't find the container with id 99ff579124b98ff4bd45a5697254358ce2daebdf9fef15766a017588e6e39c59
Apr 16 22:35:48.189530 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.189495 2574 generic.go:358] "Generic (PLEG): container finished" podID="75940772-1803-4b20-a926-4128e5d5deb5" containerID="1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542" exitCode=2
Apr 16 22:35:48.189996 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.189570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" event={"ID":"75940772-1803-4b20-a926-4128e5d5deb5","Type":"ContainerDied","Data":"1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542"}
Apr 16 22:35:48.191048 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.191025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" event={"ID":"9a618200-841c-40c4-a676-e4845f4aed14","Type":"ContainerStarted","Data":"42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57"}
Apr 16 22:35:48.191166 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.191054 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" event={"ID":"9a618200-841c-40c4-a676-e4845f4aed14","Type":"ContainerStarted","Data":"46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f"}
Apr 16 22:35:48.191166 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.191065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" event={"ID":"9a618200-841c-40c4-a676-e4845f4aed14","Type":"ContainerStarted","Data":"99ff579124b98ff4bd45a5697254358ce2daebdf9fef15766a017588e6e39c59"}
Apr 16 22:35:48.191166 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.191157 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:48.209302 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:48.209260 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podStartSLOduration=1.209246022 podStartE2EDuration="1.209246022s" podCreationTimestamp="2026-04-16 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:35:48.207505476 +0000 UTC m=+1335.450321549" watchObservedRunningTime="2026-04-16 22:35:48.209246022 +0000 UTC m=+1335.452062085"
Apr 16 22:35:49.194157 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:49.194118 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:49.195259 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:49.195230 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 22:35:50.197266 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.197230 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 22:35:50.439906 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.439882 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:35:50.555987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.555874 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmdgx\" (UniqueName: \"kubernetes.io/projected/75940772-1803-4b20-a926-4128e5d5deb5-kube-api-access-mmdgx\") pod \"75940772-1803-4b20-a926-4128e5d5deb5\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") "
Apr 16 22:35:50.555987 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.555961 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75940772-1803-4b20-a926-4128e5d5deb5-success-200-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"75940772-1803-4b20-a926-4128e5d5deb5\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") "
Apr 16 22:35:50.556209 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.556024 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") pod \"75940772-1803-4b20-a926-4128e5d5deb5\" (UID: \"75940772-1803-4b20-a926-4128e5d5deb5\") "
Apr 16 22:35:50.556350 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.556330 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75940772-1803-4b20-a926-4128e5d5deb5-success-200-isvc-1148f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-1148f-kube-rbac-proxy-sar-config") pod "75940772-1803-4b20-a926-4128e5d5deb5" (UID: "75940772-1803-4b20-a926-4128e5d5deb5"). InnerVolumeSpecName "success-200-isvc-1148f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:35:50.558092 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.558069 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75940772-1803-4b20-a926-4128e5d5deb5-kube-api-access-mmdgx" (OuterVolumeSpecName: "kube-api-access-mmdgx") pod "75940772-1803-4b20-a926-4128e5d5deb5" (UID: "75940772-1803-4b20-a926-4128e5d5deb5"). InnerVolumeSpecName "kube-api-access-mmdgx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:35:50.558168 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.558083 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "75940772-1803-4b20-a926-4128e5d5deb5" (UID: "75940772-1803-4b20-a926-4128e5d5deb5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:35:50.657004 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.656975 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75940772-1803-4b20-a926-4128e5d5deb5-success-200-isvc-1148f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:35:50.657004 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.657001 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75940772-1803-4b20-a926-4128e5d5deb5-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:35:50.657004 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:50.657012 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmdgx\" (UniqueName: \"kubernetes.io/projected/75940772-1803-4b20-a926-4128e5d5deb5-kube-api-access-mmdgx\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:35:51.070910 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.070881 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"
Apr 16 22:35:51.200504 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.200467 2574 generic.go:358] "Generic (PLEG): container finished" podID="75940772-1803-4b20-a926-4128e5d5deb5" containerID="4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3" exitCode=0
Apr 16 22:35:51.200960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.200520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" event={"ID":"75940772-1803-4b20-a926-4128e5d5deb5","Type":"ContainerDied","Data":"4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3"}
Apr 16 22:35:51.200960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.200551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl" event={"ID":"75940772-1803-4b20-a926-4128e5d5deb5","Type":"ContainerDied","Data":"c38f31a40e9f0445d0d65cea392440f5083cbd66fecd8b7b2764e265526e0021"}
Apr 16 22:35:51.200960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.200571 2574 scope.go:117] "RemoveContainer" containerID="1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542"
Apr 16 22:35:51.200960 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.200552 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"
Apr 16 22:35:51.208615 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.208595 2574 scope.go:117] "RemoveContainer" containerID="4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3"
Apr 16 22:35:51.215234 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.215215 2574 scope.go:117] "RemoveContainer" containerID="1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542"
Apr 16 22:35:51.215464 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:35:51.215445 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542\": container with ID starting with 1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542 not found: ID does not exist" containerID="1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542"
Apr 16 22:35:51.215539 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.215479 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542"} err="failed to get container status \"1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542\": rpc error: code = NotFound desc = could not find container \"1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542\": container with ID starting with 1c2a3feeb688956394a24cf2eef8213bbc7bc4f3532e0edb64c2bce891600542 not found: ID does not exist"
Apr 16 22:35:51.215539 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.215506 2574 scope.go:117] "RemoveContainer" containerID="4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3"
Apr 16 22:35:51.215735 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:35:51.215718 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3\": container with ID starting with 4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3 not found: ID does not exist" containerID="4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3"
Apr 16 22:35:51.215773 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.215741 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3"} err="failed to get container status \"4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3\": rpc error: code = NotFound desc = could not find container \"4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3\": container with ID starting with 4cb5d8d2a2e464a7fc3ed844cb29c827362efd92c3d6bf310e27ae49ce99e5e3 not found: ID does not exist"
Apr 16 22:35:51.220108 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.220087 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"]
Apr 16 22:35:51.223410 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.223388 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1148f-predictor-7477c489b6-228hl"]
Apr 16 22:35:51.449205 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:51.449175 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75940772-1803-4b20-a926-4128e5d5deb5" path="/var/lib/kubelet/pods/75940772-1803-4b20-a926-4128e5d5deb5/volumes"
Apr 16 22:35:55.202108 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:55.202081 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"
Apr 16 22:35:55.202639 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:35:55.202614 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 22:36:05.202702 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:05.202660 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 22:36:13.036227 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.036153 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"]
Apr 16 22:36:13.037086 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.037033 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" containerID="cri-o://6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f" gracePeriod=30
Apr 16 22:36:13.037586 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.037260 2574 kuberuntime_container.go:864] "Killing container with a grace period"
pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kube-rbac-proxy" containerID="cri-o://df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f" gracePeriod=30 Apr 16 22:36:13.063489 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063466 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2"] Apr 16 22:36:13.063751 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063739 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" Apr 16 22:36:13.063794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063753 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" Apr 16 22:36:13.063794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063768 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kube-rbac-proxy" Apr 16 22:36:13.063794 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063774 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kube-rbac-proxy" Apr 16 22:36:13.063885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063820 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kube-rbac-proxy" Apr 16 22:36:13.063885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.063831 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75940772-1803-4b20-a926-4128e5d5deb5" containerName="kserve-container" Apr 16 22:36:13.068378 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.068362 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.070745 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.070727 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c905c-predictor-serving-cert\"" Apr 16 22:36:13.070820 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.070784 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c905c-kube-rbac-proxy-sar-config\"" Apr 16 22:36:13.077237 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.077216 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2"] Apr 16 22:36:13.232563 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.232528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-proxy-tls\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.232713 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.232598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-success-200-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.232713 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.232699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-975b2\" (UniqueName: \"kubernetes.io/projected/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-kube-api-access-975b2\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.264283 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.264254 2574 generic.go:358] "Generic (PLEG): container finished" podID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerID="df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f" exitCode=2 Apr 16 22:36:13.264407 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.264324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" event={"ID":"16dd29ae-14fc-440c-95c1-b4ff1758bd52","Type":"ContainerDied","Data":"df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f"} Apr 16 22:36:13.333537 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.333460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-proxy-tls\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.333537 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.333503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-success-200-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.333743 ip-10-0-138-191 kubenswrapper[2574]: I0416 
22:36:13.333556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-975b2\" (UniqueName: \"kubernetes.io/projected/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-kube-api-access-975b2\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.334287 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.334267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-success-200-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.336383 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.336358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-proxy-tls\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.341180 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.341153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-975b2\" (UniqueName: \"kubernetes.io/projected/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-kube-api-access-975b2\") pod \"success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.379760 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.379733 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:13.492460 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:13.492430 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2"] Apr 16 22:36:13.495521 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:36:13.495495 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11feab4_f9b3_4d15_9bae_a22c7a9b73f6.slice/crio-0c1e6f81549e53e37df03ae6538114e727ed9fb7a0c758fe9c088003b972a104 WatchSource:0}: Error finding container 0c1e6f81549e53e37df03ae6538114e727ed9fb7a0c758fe9c088003b972a104: Status 404 returned error can't find the container with id 0c1e6f81549e53e37df03ae6538114e727ed9fb7a0c758fe9c088003b972a104 Apr 16 22:36:14.268713 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:14.268675 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" event={"ID":"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6","Type":"ContainerStarted","Data":"351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d"} Apr 16 22:36:14.268713 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:14.268714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" event={"ID":"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6","Type":"ContainerStarted","Data":"adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf"} Apr 16 22:36:14.269181 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:14.268726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" event={"ID":"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6","Type":"ContainerStarted","Data":"0c1e6f81549e53e37df03ae6538114e727ed9fb7a0c758fe9c088003b972a104"} Apr 16 22:36:14.269181 
ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:14.268845 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:14.285729 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:14.285681 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podStartSLOduration=1.285669965 podStartE2EDuration="1.285669965s" podCreationTimestamp="2026-04-16 22:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:14.283740577 +0000 UTC m=+1361.526556665" watchObservedRunningTime="2026-04-16 22:36:14.285669965 +0000 UTC m=+1361.528486026" Apr 16 22:36:15.203009 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:15.202972 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 22:36:15.271208 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:15.271175 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:15.272497 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:15.272473 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 22:36:15.987405 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:15.987386 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:36:16.157616 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.157589 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls\") pod \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " Apr 16 22:36:16.157784 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.157646 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16dd29ae-14fc-440c-95c1-b4ff1758bd52-success-200-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " Apr 16 22:36:16.157784 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.157699 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbmnz\" (UniqueName: \"kubernetes.io/projected/16dd29ae-14fc-440c-95c1-b4ff1758bd52-kube-api-access-fbmnz\") pod \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\" (UID: \"16dd29ae-14fc-440c-95c1-b4ff1758bd52\") " Apr 16 22:36:16.157982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.157956 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dd29ae-14fc-440c-95c1-b4ff1758bd52-success-200-isvc-cf933-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-cf933-kube-rbac-proxy-sar-config") pod "16dd29ae-14fc-440c-95c1-b4ff1758bd52" (UID: "16dd29ae-14fc-440c-95c1-b4ff1758bd52"). InnerVolumeSpecName "success-200-isvc-cf933-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:36:16.159648 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.159626 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dd29ae-14fc-440c-95c1-b4ff1758bd52-kube-api-access-fbmnz" (OuterVolumeSpecName: "kube-api-access-fbmnz") pod "16dd29ae-14fc-440c-95c1-b4ff1758bd52" (UID: "16dd29ae-14fc-440c-95c1-b4ff1758bd52"). InnerVolumeSpecName "kube-api-access-fbmnz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:36:16.159730 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.159649 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "16dd29ae-14fc-440c-95c1-b4ff1758bd52" (UID: "16dd29ae-14fc-440c-95c1-b4ff1758bd52"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:36:16.258438 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.258395 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16dd29ae-14fc-440c-95c1-b4ff1758bd52-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:36:16.258438 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.258435 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16dd29ae-14fc-440c-95c1-b4ff1758bd52-success-200-isvc-cf933-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:36:16.258438 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.258446 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbmnz\" (UniqueName: \"kubernetes.io/projected/16dd29ae-14fc-440c-95c1-b4ff1758bd52-kube-api-access-fbmnz\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 
22:36:16.274531 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.274487 2574 generic.go:358] "Generic (PLEG): container finished" podID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerID="6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f" exitCode=0 Apr 16 22:36:16.274885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.274534 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" event={"ID":"16dd29ae-14fc-440c-95c1-b4ff1758bd52","Type":"ContainerDied","Data":"6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f"} Apr 16 22:36:16.274885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.274573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" event={"ID":"16dd29ae-14fc-440c-95c1-b4ff1758bd52","Type":"ContainerDied","Data":"0f0d72323a63b12e13e6498e3f7a8c85b30915da55c8f5dcedb578649551b5db"} Apr 16 22:36:16.274885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.274573 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9" Apr 16 22:36:16.274885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.274587 2574 scope.go:117] "RemoveContainer" containerID="df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f" Apr 16 22:36:16.275120 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.275039 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 22:36:16.282678 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.282660 2574 scope.go:117] "RemoveContainer" containerID="6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f" Apr 16 22:36:16.289563 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.289547 2574 scope.go:117] "RemoveContainer" containerID="df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f" Apr 16 22:36:16.289813 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:36:16.289796 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f\": container with ID starting with df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f not found: ID does not exist" containerID="df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f" Apr 16 22:36:16.289859 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.289821 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f"} err="failed to get container status \"df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f\": rpc error: code = NotFound desc = could not find container 
\"df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f\": container with ID starting with df0f65957d7981faa6162670a302342f396a480077a26a5e8dd761a80b40d82f not found: ID does not exist" Apr 16 22:36:16.289859 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.289839 2574 scope.go:117] "RemoveContainer" containerID="6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f" Apr 16 22:36:16.290065 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:36:16.290047 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f\": container with ID starting with 6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f not found: ID does not exist" containerID="6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f" Apr 16 22:36:16.290115 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.290071 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f"} err="failed to get container status \"6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f\": rpc error: code = NotFound desc = could not find container \"6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f\": container with ID starting with 6b12ff915ee575dcd6c28e698266d372dc8e2f769f96e5934d0675a15c441c4f not found: ID does not exist" Apr 16 22:36:16.296854 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.296833 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"] Apr 16 22:36:16.300978 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:16.300956 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cf933-predictor-c8d746ff8-62qm9"] Apr 16 22:36:17.448163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:17.448135 
2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" path="/var/lib/kubelet/pods/16dd29ae-14fc-440c-95c1-b4ff1758bd52/volumes" Apr 16 22:36:21.279839 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:21.279811 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:36:21.280406 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:21.280378 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 22:36:25.203262 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:25.203221 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 22:36:31.280911 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:31.280870 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 22:36:35.203087 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:35.203058 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" Apr 16 22:36:41.280772 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:41.280733 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" 
podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 22:36:51.280836 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:36:51.280792 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 22:37:01.281545 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:37:01.281511 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:38:33.401105 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:38:33.400992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:38:33.404069 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:38:33.403661 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:43:33.419677 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:43:33.419561 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:43:33.423907 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:43:33.422475 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 22:45:02.039030 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.038953 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"] Apr 16 
22:45:02.039530 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.039224 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" containerID="cri-o://46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f" gracePeriod=30 Apr 16 22:45:02.039530 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.039281 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kube-rbac-proxy" containerID="cri-o://42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57" gracePeriod=30 Apr 16 22:45:02.136504 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.136469 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"] Apr 16 22:45:02.136750 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.136738 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kube-rbac-proxy" Apr 16 22:45:02.136797 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.136751 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kube-rbac-proxy" Apr 16 22:45:02.136797 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.136759 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" Apr 16 22:45:02.136797 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.136765 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" Apr 16 22:45:02.136895 ip-10-0-138-191 kubenswrapper[2574]: I0416 
22:45:02.136812 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kserve-container" Apr 16 22:45:02.136895 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.136821 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="16dd29ae-14fc-440c-95c1-b4ff1758bd52" containerName="kube-rbac-proxy" Apr 16 22:45:02.139698 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.139679 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.141970 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.141947 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\"" Apr 16 22:45:02.142090 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.142002 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a5b4b-predictor-serving-cert\"" Apr 16 22:45:02.160517 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.160486 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"] Apr 16 22:45:02.201804 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.201773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.201962 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.201822 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtqn\" (UniqueName: \"kubernetes.io/projected/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-kube-api-access-9jtqn\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.201962 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.201869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-proxy-tls\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.303157 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.303072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.303157 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.303118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jtqn\" (UniqueName: \"kubernetes.io/projected/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-kube-api-access-9jtqn\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.303157 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.303151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-proxy-tls\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.303772 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.303746 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.305546 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.305527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-proxy-tls\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.311379 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.311355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtqn\" (UniqueName: \"kubernetes.io/projected/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-kube-api-access-9jtqn\") pod \"success-200-isvc-a5b4b-predictor-69f688f4c8-798ld\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.449753 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.449725 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.573772 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.573690 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"] Apr 16 22:45:02.576567 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:45:02.576542 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf047fe29_c47c_49f8_9d46_d154dd5bbf6c.slice/crio-b9959958d52c0bd38beb65d81a3647dfc2fbe682aeaacc2208dd238ae577a9da WatchSource:0}: Error finding container b9959958d52c0bd38beb65d81a3647dfc2fbe682aeaacc2208dd238ae577a9da: Status 404 returned error can't find the container with id b9959958d52c0bd38beb65d81a3647dfc2fbe682aeaacc2208dd238ae577a9da Apr 16 22:45:02.578179 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.578164 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:45:02.702779 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.702754 2574 generic.go:358] "Generic (PLEG): container finished" podID="9a618200-841c-40c4-a676-e4845f4aed14" containerID="42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57" exitCode=2 Apr 16 22:45:02.702885 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.702817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" event={"ID":"9a618200-841c-40c4-a676-e4845f4aed14","Type":"ContainerDied","Data":"42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57"} Apr 16 22:45:02.704336 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.704316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" 
event={"ID":"f047fe29-c47c-49f8-9d46-d154dd5bbf6c","Type":"ContainerStarted","Data":"98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038"} Apr 16 22:45:02.704430 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.704345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" event={"ID":"f047fe29-c47c-49f8-9d46-d154dd5bbf6c","Type":"ContainerStarted","Data":"32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c"} Apr 16 22:45:02.704430 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.704360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" event={"ID":"f047fe29-c47c-49f8-9d46-d154dd5bbf6c","Type":"ContainerStarted","Data":"b9959958d52c0bd38beb65d81a3647dfc2fbe682aeaacc2208dd238ae577a9da"} Apr 16 22:45:02.704507 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.704449 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:02.724127 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:02.724087 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podStartSLOduration=0.724074344 podStartE2EDuration="724.074344ms" podCreationTimestamp="2026-04-16 22:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:45:02.723420296 +0000 UTC m=+1889.966236367" watchObservedRunningTime="2026-04-16 22:45:02.724074344 +0000 UTC m=+1889.966890405" Apr 16 22:45:03.706659 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:03.706625 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:03.708236 
ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:03.708211 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:45:04.710045 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:04.710007 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:45:05.198311 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.198270 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 16 22:45:05.202592 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.202565 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 22:45:05.283775 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.283740 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" Apr 16 22:45:05.329437 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.329412 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a618200-841c-40c4-a676-e4845f4aed14-proxy-tls\") pod \"9a618200-841c-40c4-a676-e4845f4aed14\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " Apr 16 22:45:05.329562 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.329495 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s88tf\" (UniqueName: \"kubernetes.io/projected/9a618200-841c-40c4-a676-e4845f4aed14-kube-api-access-s88tf\") pod \"9a618200-841c-40c4-a676-e4845f4aed14\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " Apr 16 22:45:05.329562 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.329537 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a618200-841c-40c4-a676-e4845f4aed14-success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"9a618200-841c-40c4-a676-e4845f4aed14\" (UID: \"9a618200-841c-40c4-a676-e4845f4aed14\") " Apr 16 22:45:05.329916 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.329887 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a618200-841c-40c4-a676-e4845f4aed14-success-200-isvc-ea3bc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ea3bc-kube-rbac-proxy-sar-config") pod "9a618200-841c-40c4-a676-e4845f4aed14" (UID: "9a618200-841c-40c4-a676-e4845f4aed14"). InnerVolumeSpecName "success-200-isvc-ea3bc-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:45:05.331506 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.331484 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a618200-841c-40c4-a676-e4845f4aed14-kube-api-access-s88tf" (OuterVolumeSpecName: "kube-api-access-s88tf") pod "9a618200-841c-40c4-a676-e4845f4aed14" (UID: "9a618200-841c-40c4-a676-e4845f4aed14"). InnerVolumeSpecName "kube-api-access-s88tf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:45:05.331771 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.331748 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a618200-841c-40c4-a676-e4845f4aed14-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a618200-841c-40c4-a676-e4845f4aed14" (UID: "9a618200-841c-40c4-a676-e4845f4aed14"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:45:05.430967 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.430871 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s88tf\" (UniqueName: \"kubernetes.io/projected/9a618200-841c-40c4-a676-e4845f4aed14-kube-api-access-s88tf\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:45:05.430967 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.430900 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a618200-841c-40c4-a676-e4845f4aed14-success-200-isvc-ea3bc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:45:05.430967 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.430913 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a618200-841c-40c4-a676-e4845f4aed14-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 
22:45:05.713379 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.713289 2574 generic.go:358] "Generic (PLEG): container finished" podID="9a618200-841c-40c4-a676-e4845f4aed14" containerID="46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f" exitCode=0 Apr 16 22:45:05.713379 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.713363 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" Apr 16 22:45:05.713863 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.713378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" event={"ID":"9a618200-841c-40c4-a676-e4845f4aed14","Type":"ContainerDied","Data":"46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f"} Apr 16 22:45:05.713863 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.713423 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww" event={"ID":"9a618200-841c-40c4-a676-e4845f4aed14","Type":"ContainerDied","Data":"99ff579124b98ff4bd45a5697254358ce2daebdf9fef15766a017588e6e39c59"} Apr 16 22:45:05.713863 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.713443 2574 scope.go:117] "RemoveContainer" containerID="42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57" Apr 16 22:45:05.721097 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.721083 2574 scope.go:117] "RemoveContainer" containerID="46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f" Apr 16 22:45:05.727967 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.727925 2574 scope.go:117] "RemoveContainer" containerID="42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57" Apr 16 22:45:05.728229 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:45:05.728209 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57\": container with ID starting with 42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57 not found: ID does not exist" containerID="42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57" Apr 16 22:45:05.728314 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.728240 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57"} err="failed to get container status \"42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57\": rpc error: code = NotFound desc = could not find container \"42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57\": container with ID starting with 42174975f0ffe8b80bb4f439ca62d539834be7ced77e63fea61a5d55e3eb0b57 not found: ID does not exist" Apr 16 22:45:05.728314 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.728264 2574 scope.go:117] "RemoveContainer" containerID="46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f" Apr 16 22:45:05.728532 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:45:05.728510 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f\": container with ID starting with 46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f not found: ID does not exist" containerID="46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f" Apr 16 22:45:05.728573 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.728542 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f"} err="failed to get container status \"46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f\": rpc error: code = NotFound desc = could 
not find container \"46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f\": container with ID starting with 46f58c3e8d53d7896e190e1ff8fa50b26f5ebb5395e987ae476dbfb350d8435f not found: ID does not exist" Apr 16 22:45:05.728925 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.728908 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"] Apr 16 22:45:05.732539 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:05.732514 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ea3bc-predictor-5c55999dbb-67zww"] Apr 16 22:45:07.448013 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:07.447980 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a618200-841c-40c4-a676-e4845f4aed14" path="/var/lib/kubelet/pods/9a618200-841c-40c4-a676-e4845f4aed14/volumes" Apr 16 22:45:09.714269 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:09.714239 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:09.714802 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:09.714775 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:45:19.715286 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:19.715245 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:45:27.829247 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.829215 2574 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2"] Apr 16 22:45:27.829741 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.829503 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" containerID="cri-o://adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf" gracePeriod=30 Apr 16 22:45:27.829741 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.829591 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kube-rbac-proxy" containerID="cri-o://351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d" gracePeriod=30 Apr 16 22:45:27.888704 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.888677 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"] Apr 16 22:45:27.889038 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.889024 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" Apr 16 22:45:27.889038 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.889039 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" Apr 16 22:45:27.889163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.889056 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kube-rbac-proxy" Apr 16 22:45:27.889163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.889065 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kube-rbac-proxy" Apr 16 22:45:27.889163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.889113 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kserve-container" Apr 16 22:45:27.889163 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.889121 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a618200-841c-40c4-a676-e4845f4aed14" containerName="kube-rbac-proxy" Apr 16 22:45:27.891982 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.891961 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:27.896697 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.896679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a9d3d-predictor-serving-cert\"" Apr 16 22:45:27.896961 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.896919 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\"" Apr 16 22:45:27.905522 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.905500 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"] Apr 16 22:45:27.992977 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.992947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:27.993134 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.992994 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:27.993134 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:27.993037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twr55\" (UniqueName: \"kubernetes.io/projected/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-kube-api-access-twr55\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.093654 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.093579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.093654 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.093616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.093654 ip-10-0-138-191 kubenswrapper[2574]: I0416 
22:45:28.093644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twr55\" (UniqueName: \"kubernetes.io/projected/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-kube-api-access-twr55\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.093884 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:45:28.093732 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-serving-cert: secret "success-200-isvc-a9d3d-predictor-serving-cert" not found Apr 16 22:45:28.093884 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:45:28.093803 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls podName:fb2b33e5-3848-4e5d-b4eb-4c7a830d9384 nodeName:}" failed. No retries permitted until 2026-04-16 22:45:28.593787363 +0000 UTC m=+1915.836603402 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls") pod "success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" (UID: "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384") : secret "success-200-isvc-a9d3d-predictor-serving-cert" not found Apr 16 22:45:28.094285 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.094266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.106273 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.106253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twr55\" (UniqueName: \"kubernetes.io/projected/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-kube-api-access-twr55\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.598373 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.598331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls\") pod \"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.600522 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.600504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls\") pod 
\"success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") " pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.777286 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.777250 2574 generic.go:358] "Generic (PLEG): container finished" podID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerID="351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d" exitCode=2 Apr 16 22:45:28.777286 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.777290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" event={"ID":"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6","Type":"ContainerDied","Data":"351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d"} Apr 16 22:45:28.802439 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.802410 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:28.921285 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:28.921254 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"] Apr 16 22:45:28.924004 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:45:28.923973 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2b33e5_3848_4e5d_b4eb_4c7a830d9384.slice/crio-5b125894a1590e2bb5a9253fdda5486eb25d1c747c0f44e14e88c435e7c41f4c WatchSource:0}: Error finding container 5b125894a1590e2bb5a9253fdda5486eb25d1c747c0f44e14e88c435e7c41f4c: Status 404 returned error can't find the container with id 5b125894a1590e2bb5a9253fdda5486eb25d1c747c0f44e14e88c435e7c41f4c Apr 16 22:45:29.715046 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.715009 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:45:29.781841 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.781799 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" event={"ID":"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384","Type":"ContainerStarted","Data":"165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7"} Apr 16 22:45:29.781841 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.781837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" event={"ID":"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384","Type":"ContainerStarted","Data":"2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304"} Apr 16 22:45:29.782078 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.781850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" event={"ID":"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384","Type":"ContainerStarted","Data":"5b125894a1590e2bb5a9253fdda5486eb25d1c747c0f44e14e88c435e7c41f4c"} Apr 16 22:45:29.782078 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.782043 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:29.782200 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.782183 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:29.783394 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.783369 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 22:45:29.799273 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:29.799229 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podStartSLOduration=2.7992182420000002 podStartE2EDuration="2.799218242s" podCreationTimestamp="2026-04-16 22:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:45:29.798846373 +0000 UTC m=+1917.041662433" watchObservedRunningTime="2026-04-16 22:45:29.799218242 +0000 UTC m=+1917.042034302" Apr 16 22:45:30.785738 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:30.785689 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 22:45:31.067185 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.067159 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:45:31.120487 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.120457 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-success-200-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " Apr 16 22:45:31.120636 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.120492 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-proxy-tls\") pod \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " Apr 16 22:45:31.120636 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.120544 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-975b2\" (UniqueName: \"kubernetes.io/projected/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-kube-api-access-975b2\") pod \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\" (UID: \"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6\") " Apr 16 22:45:31.120842 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.120808 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-success-200-isvc-c905c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-c905c-kube-rbac-proxy-sar-config") pod "d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" (UID: "d11feab4-f9b3-4d15-9bae-a22c7a9b73f6"). InnerVolumeSpecName "success-200-isvc-c905c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:45:31.122539 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.122516 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" (UID: "d11feab4-f9b3-4d15-9bae-a22c7a9b73f6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:45:31.122637 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.122569 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-kube-api-access-975b2" (OuterVolumeSpecName: "kube-api-access-975b2") pod "d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" (UID: "d11feab4-f9b3-4d15-9bae-a22c7a9b73f6"). InnerVolumeSpecName "kube-api-access-975b2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:45:31.221064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.221018 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-975b2\" (UniqueName: \"kubernetes.io/projected/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-kube-api-access-975b2\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:45:31.221064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.221058 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-success-200-isvc-c905c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:45:31.221064 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.221069 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 
22:45:31.789158 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.789068 2574 generic.go:358] "Generic (PLEG): container finished" podID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerID="adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf" exitCode=0 Apr 16 22:45:31.789158 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.789153 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" Apr 16 22:45:31.789665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.789155 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" event={"ID":"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6","Type":"ContainerDied","Data":"adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf"} Apr 16 22:45:31.789665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.789205 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2" event={"ID":"d11feab4-f9b3-4d15-9bae-a22c7a9b73f6","Type":"ContainerDied","Data":"0c1e6f81549e53e37df03ae6538114e727ed9fb7a0c758fe9c088003b972a104"} Apr 16 22:45:31.789665 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.789227 2574 scope.go:117] "RemoveContainer" containerID="351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d" Apr 16 22:45:31.796841 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.796813 2574 scope.go:117] "RemoveContainer" containerID="adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf" Apr 16 22:45:31.803325 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.803297 2574 scope.go:117] "RemoveContainer" containerID="351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d" Apr 16 22:45:31.803560 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:45:31.803543 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d\": container with ID starting with 351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d not found: ID does not exist" containerID="351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d" Apr 16 22:45:31.803623 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.803583 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d"} err="failed to get container status \"351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d\": rpc error: code = NotFound desc = could not find container \"351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d\": container with ID starting with 351424b61ed82135a17f22d092255ce2ed846d80596aa2dc1d6ae36954c09e8d not found: ID does not exist" Apr 16 22:45:31.803623 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.803600 2574 scope.go:117] "RemoveContainer" containerID="adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf" Apr 16 22:45:31.803885 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:45:31.803860 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf\": container with ID starting with adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf not found: ID does not exist" containerID="adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf" Apr 16 22:45:31.803979 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.803893 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf"} err="failed to get container status \"adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf\": rpc error: code = NotFound desc = could 
not find container \"adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf\": container with ID starting with adf32a423c9aa1fd5c41c57128bd8ea59cb2f299527df6970fe6ae756c5d43bf not found: ID does not exist" Apr 16 22:45:31.805594 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.805575 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2"] Apr 16 22:45:31.807587 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:31.807562 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c905c-predictor-5b6dddb7d6-fsjz2"] Apr 16 22:45:33.447745 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:33.447710 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" path="/var/lib/kubelet/pods/d11feab4-f9b3-4d15-9bae-a22c7a9b73f6/volumes" Apr 16 22:45:35.790301 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:35.790274 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:45:35.790783 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:35.790757 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 22:45:39.715185 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:39.715148 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 22:45:45.791402 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:45.791358 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 22:45:49.715082 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:49.715051 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:45:55.791209 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:45:55.791169 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 22:46:05.791157 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:05.791120 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 22:46:12.298118 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.298029 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"] Apr 16 22:46:12.298569 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.298511 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container" containerID="cri-o://32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c" gracePeriod=30 Apr 16 22:46:12.298641 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.298547 2574 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kube-rbac-proxy" containerID="cri-o://98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038" gracePeriod=30 Apr 16 22:46:12.352548 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352512 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"] Apr 16 22:46:12.352869 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352852 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kube-rbac-proxy" Apr 16 22:46:12.352921 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352871 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kube-rbac-proxy" Apr 16 22:46:12.352921 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352891 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" Apr 16 22:46:12.352921 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352899 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" Apr 16 22:46:12.353046 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352962 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kserve-container" Apr 16 22:46:12.353046 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.352971 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d11feab4-f9b3-4d15-9bae-a22c7a9b73f6" containerName="kube-rbac-proxy" Apr 16 22:46:12.356236 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.356213 2574 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.359019 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.358998 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3d312-predictor-serving-cert\"" Apr 16 22:46:12.359147 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.359028 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3d312-kube-rbac-proxy-sar-config\"" Apr 16 22:46:12.367956 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.367915 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"] Apr 16 22:46:12.437965 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.437921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.438115 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.437993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj54\" (UniqueName: \"kubernetes.io/projected/aaa7f141-99ee-49c9-852a-6a76cf18eb47-kube-api-access-wtj54\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.438115 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.438024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-3d312-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/aaa7f141-99ee-49c9-852a-6a76cf18eb47-success-200-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.538836 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.538794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj54\" (UniqueName: \"kubernetes.io/projected/aaa7f141-99ee-49c9-852a-6a76cf18eb47-kube-api-access-wtj54\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.539042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.538860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aaa7f141-99ee-49c9-852a-6a76cf18eb47-success-200-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.539042 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.538970 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.539174 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:46:12.539067 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-3d312-predictor-serving-cert: secret 
"success-200-isvc-3d312-predictor-serving-cert" not found Apr 16 22:46:12.539174 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:46:12.539153 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls podName:aaa7f141-99ee-49c9-852a-6a76cf18eb47 nodeName:}" failed. No retries permitted until 2026-04-16 22:46:13.039131334 +0000 UTC m=+1960.281947394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls") pod "success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" (UID: "aaa7f141-99ee-49c9-852a-6a76cf18eb47") : secret "success-200-isvc-3d312-predictor-serving-cert" not found Apr 16 22:46:12.539754 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.539727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aaa7f141-99ee-49c9-852a-6a76cf18eb47-success-200-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.547098 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.547070 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj54\" (UniqueName: \"kubernetes.io/projected/aaa7f141-99ee-49c9-852a-6a76cf18eb47-kube-api-access-wtj54\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:12.905459 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.905423 2574 generic.go:358] "Generic (PLEG): container finished" podID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" 
containerID="98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038" exitCode=2 Apr 16 22:46:12.905633 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:12.905482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" event={"ID":"f047fe29-c47c-49f8-9d46-d154dd5bbf6c","Type":"ContainerDied","Data":"98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038"} Apr 16 22:46:13.042086 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.042052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:13.044426 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.044396 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls\") pod \"success-200-isvc-3d312-predictor-5779fc45b8-2n9h8\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") " pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:13.268242 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.268150 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:13.387622 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.387416 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"] Apr 16 22:46:13.390084 ip-10-0-138-191 kubenswrapper[2574]: W0416 22:46:13.390055 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaa7f141_99ee_49c9_852a_6a76cf18eb47.slice/crio-c3a5d9553cc884b3bf48c4f34c2df5eb1430f7ec8cf878c02ccc114627dcb4b1 WatchSource:0}: Error finding container c3a5d9553cc884b3bf48c4f34c2df5eb1430f7ec8cf878c02ccc114627dcb4b1: Status 404 returned error can't find the container with id c3a5d9553cc884b3bf48c4f34c2df5eb1430f7ec8cf878c02ccc114627dcb4b1 Apr 16 22:46:13.909764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.909726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" event={"ID":"aaa7f141-99ee-49c9-852a-6a76cf18eb47","Type":"ContainerStarted","Data":"0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a"} Apr 16 22:46:13.909962 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.909770 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" event={"ID":"aaa7f141-99ee-49c9-852a-6a76cf18eb47","Type":"ContainerStarted","Data":"c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19"} Apr 16 22:46:13.909962 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.909785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" event={"ID":"aaa7f141-99ee-49c9-852a-6a76cf18eb47","Type":"ContainerStarted","Data":"c3a5d9553cc884b3bf48c4f34c2df5eb1430f7ec8cf878c02ccc114627dcb4b1"} Apr 16 22:46:13.910076 
ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.909976 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:13.910076 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.910011 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" Apr 16 22:46:13.911185 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.911160 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 22:46:13.926635 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:13.926592 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podStartSLOduration=1.926581034 podStartE2EDuration="1.926581034s" podCreationTimestamp="2026-04-16 22:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:46:13.924529357 +0000 UTC m=+1961.167345429" watchObservedRunningTime="2026-04-16 22:46:13.926581034 +0000 UTC m=+1961.169397094" Apr 16 22:46:14.710675 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:14.710633 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 16 22:46:14.913232 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:14.913190 2574 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 22:46:15.540994 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.540973 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" Apr 16 22:46:15.561054 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.561027 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " Apr 16 22:46:15.561054 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.561058 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-proxy-tls\") pod \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " Apr 16 22:46:15.561249 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.561113 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jtqn\" (UniqueName: \"kubernetes.io/projected/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-kube-api-access-9jtqn\") pod \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\" (UID: \"f047fe29-c47c-49f8-9d46-d154dd5bbf6c\") " Apr 16 22:46:15.561382 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.561358 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-success-200-isvc-a5b4b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config") pod "f047fe29-c47c-49f8-9d46-d154dd5bbf6c" (UID: "f047fe29-c47c-49f8-9d46-d154dd5bbf6c"). InnerVolumeSpecName "success-200-isvc-a5b4b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:15.563245 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.563215 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-kube-api-access-9jtqn" (OuterVolumeSpecName: "kube-api-access-9jtqn") pod "f047fe29-c47c-49f8-9d46-d154dd5bbf6c" (UID: "f047fe29-c47c-49f8-9d46-d154dd5bbf6c"). InnerVolumeSpecName "kube-api-access-9jtqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:46:15.563245 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.563222 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f047fe29-c47c-49f8-9d46-d154dd5bbf6c" (UID: "f047fe29-c47c-49f8-9d46-d154dd5bbf6c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:46:15.662353 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.662314 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-success-200-isvc-a5b4b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.662353 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.662350 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.662545 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.662364 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jtqn\" (UniqueName: \"kubernetes.io/projected/f047fe29-c47c-49f8-9d46-d154dd5bbf6c-kube-api-access-9jtqn\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.791117 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.791081 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" Apr 16 22:46:15.917186 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.917107 2574 generic.go:358] "Generic (PLEG): container finished" podID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerID="32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c" exitCode=0 Apr 16 22:46:15.917344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.917199 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"
Apr 16 22:46:15.917344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.917196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" event={"ID":"f047fe29-c47c-49f8-9d46-d154dd5bbf6c","Type":"ContainerDied","Data":"32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c"}
Apr 16 22:46:15.917344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.917242 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld" event={"ID":"f047fe29-c47c-49f8-9d46-d154dd5bbf6c","Type":"ContainerDied","Data":"b9959958d52c0bd38beb65d81a3647dfc2fbe682aeaacc2208dd238ae577a9da"}
Apr 16 22:46:15.917344 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.917262 2574 scope.go:117] "RemoveContainer" containerID="98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038"
Apr 16 22:46:15.925840 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.925822 2574 scope.go:117] "RemoveContainer" containerID="32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c"
Apr 16 22:46:15.936086 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.936061 2574 scope.go:117] "RemoveContainer" containerID="98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038"
Apr 16 22:46:15.936360 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:46:15.936334 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038\": container with ID starting with 98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038 not found: ID does not exist" containerID="98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038"
Apr 16 22:46:15.936448 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.936371 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038"} err="failed to get container status \"98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038\": rpc error: code = NotFound desc = could not find container \"98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038\": container with ID starting with 98dd79ee2ca0a84d1c00eb039eb145db71348d822ca6b85d7973732b41241038 not found: ID does not exist"
Apr 16 22:46:15.936448 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.936397 2574 scope.go:117] "RemoveContainer" containerID="32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c"
Apr 16 22:46:15.936649 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:46:15.936625 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c\": container with ID starting with 32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c not found: ID does not exist" containerID="32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c"
Apr 16 22:46:15.936705 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.936659 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c"} err="failed to get container status \"32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c\": rpc error: code = NotFound desc = could not find container \"32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c\": container with ID starting with 32b2572f92e986358e0e3ce98cae33ab9ba810d29373415b2b7b76be245dcc9c not found: ID does not exist"
Apr 16 22:46:15.937171 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.937152 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"]
Apr 16 22:46:15.939044 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:15.939026 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a5b4b-predictor-69f688f4c8-798ld"]
Apr 16 22:46:17.448428 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:17.448396 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" path="/var/lib/kubelet/pods/f047fe29-c47c-49f8-9d46-d154dd5bbf6c/volumes"
Apr 16 22:46:19.917318 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:19.917287 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"
Apr 16 22:46:19.917818 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:19.917792 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 16 22:46:29.917860 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:29.917820 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 16 22:46:39.917788 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:39.917752 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 16 22:46:49.917829 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:49.917784 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 16 22:46:59.918425 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:46:59.918397 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"
Apr 16 22:48:33.438022 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:48:33.437886 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:48:33.441153 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:48:33.441110 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:53:33.457168 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:53:33.457066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:53:33.460707 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:53:33.460689 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:55:27.143267 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:27.143183 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"]
Apr 16 22:55:27.143911 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:27.143544 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container" containerID="cri-o://c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19" gracePeriod=30
Apr 16 22:55:27.143911 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:27.143594 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kube-rbac-proxy" containerID="cri-o://0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a" gracePeriod=30
Apr 16 22:55:27.413764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:27.413685 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerID="0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a" exitCode=2
Apr 16 22:55:27.413764 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:27.413747 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" event={"ID":"aaa7f141-99ee-49c9-852a-6a76cf18eb47","Type":"ContainerDied","Data":"0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a"}
Apr 16 22:55:29.890132 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.890107 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"
Apr 16 22:55:29.972290 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.972250 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aaa7f141-99ee-49c9-852a-6a76cf18eb47-success-200-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") "
Apr 16 22:55:29.972463 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.972319 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtj54\" (UniqueName: \"kubernetes.io/projected/aaa7f141-99ee-49c9-852a-6a76cf18eb47-kube-api-access-wtj54\") pod \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") "
Apr 16 22:55:29.972463 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.972367 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls\") pod \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\" (UID: \"aaa7f141-99ee-49c9-852a-6a76cf18eb47\") "
Apr 16 22:55:29.972602 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.972575 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa7f141-99ee-49c9-852a-6a76cf18eb47-success-200-isvc-3d312-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-3d312-kube-rbac-proxy-sar-config") pod "aaa7f141-99ee-49c9-852a-6a76cf18eb47" (UID: "aaa7f141-99ee-49c9-852a-6a76cf18eb47"). InnerVolumeSpecName "success-200-isvc-3d312-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:55:29.974379 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.974354 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "aaa7f141-99ee-49c9-852a-6a76cf18eb47" (UID: "aaa7f141-99ee-49c9-852a-6a76cf18eb47"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:55:29.974476 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:29.974354 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa7f141-99ee-49c9-852a-6a76cf18eb47-kube-api-access-wtj54" (OuterVolumeSpecName: "kube-api-access-wtj54") pod "aaa7f141-99ee-49c9-852a-6a76cf18eb47" (UID: "aaa7f141-99ee-49c9-852a-6a76cf18eb47"). InnerVolumeSpecName "kube-api-access-wtj54". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:55:30.073616 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.073519 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtj54\" (UniqueName: \"kubernetes.io/projected/aaa7f141-99ee-49c9-852a-6a76cf18eb47-kube-api-access-wtj54\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:55:30.073616 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.073556 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaa7f141-99ee-49c9-852a-6a76cf18eb47-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:55:30.073616 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.073572 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/aaa7f141-99ee-49c9-852a-6a76cf18eb47-success-200-isvc-3d312-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 22:55:30.422731 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.422701 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerID="c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19" exitCode=0
Apr 16 22:55:30.422912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.422738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" event={"ID":"aaa7f141-99ee-49c9-852a-6a76cf18eb47","Type":"ContainerDied","Data":"c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19"}
Apr 16 22:55:30.422912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.422761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8" event={"ID":"aaa7f141-99ee-49c9-852a-6a76cf18eb47","Type":"ContainerDied","Data":"c3a5d9553cc884b3bf48c4f34c2df5eb1430f7ec8cf878c02ccc114627dcb4b1"}
Apr 16 22:55:30.422912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.422775 2574 scope.go:117] "RemoveContainer" containerID="0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a"
Apr 16 22:55:30.422912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.422774 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"
Apr 16 22:55:30.430566 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.430531 2574 scope.go:117] "RemoveContainer" containerID="c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19"
Apr 16 22:55:30.437264 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.437243 2574 scope.go:117] "RemoveContainer" containerID="0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a"
Apr 16 22:55:30.437493 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:55:30.437473 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a\": container with ID starting with 0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a not found: ID does not exist" containerID="0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a"
Apr 16 22:55:30.437543 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.437503 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a"} err="failed to get container status \"0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a\": rpc error: code = NotFound desc = could not find container \"0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a\": container with ID starting with 0da37e08626e405fc7bd80613e7064684ae8d17b7dcc81e7397baa5bd5e3ee7a not found: ID does not exist"
Apr 16 22:55:30.437543 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.437522 2574 scope.go:117] "RemoveContainer" containerID="c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19"
Apr 16 22:55:30.437762 ip-10-0-138-191 kubenswrapper[2574]: E0416 22:55:30.437743 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19\": container with ID starting with c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19 not found: ID does not exist" containerID="c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19"
Apr 16 22:55:30.437813 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.437770 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19"} err="failed to get container status \"c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19\": rpc error: code = NotFound desc = could not find container \"c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19\": container with ID starting with c7a37ce50c45b0b151a8f4c8205fb6f437980daa9aff2f1405cd096b0d027a19 not found: ID does not exist"
Apr 16 22:55:30.442573 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.442554 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"]
Apr 16 22:55:30.446386 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:30.446346 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d312-predictor-5779fc45b8-2n9h8"]
Apr 16 22:55:31.447912 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:55:31.447878 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" path="/var/lib/kubelet/pods/aaa7f141-99ee-49c9-852a-6a76cf18eb47/volumes"
Apr 16 22:58:33.478489 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:58:33.478387 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 22:58:33.483429 ip-10-0-138-191 kubenswrapper[2574]: I0416 22:58:33.483410 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log"
Apr 16 23:02:47.365059 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:47.364983 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"]
Apr 16 23:02:47.365541 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:47.365245 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container" containerID="cri-o://2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304" gracePeriod=30
Apr 16 23:02:47.365541 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:47.365271 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kube-rbac-proxy" containerID="cri-o://165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7" gracePeriod=30
Apr 16 23:02:47.389659 ip-10-0-138-191 kubenswrapper[2574]: E0416 23:02:47.389631 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2b33e5_3848_4e5d_b4eb_4c7a830d9384.slice/crio-conmon-165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 23:02:47.562598 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:47.562560 2574 generic.go:358] "Generic (PLEG): container finished" podID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerID="165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7" exitCode=2
Apr 16 23:02:47.562742 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:47.562607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" event={"ID":"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384","Type":"ContainerDied","Data":"165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7"}
Apr 16 23:02:50.404325 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.404304 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"
Apr 16 23:02:50.450569 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.450542 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twr55\" (UniqueName: \"kubernetes.io/projected/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-kube-api-access-twr55\") pod \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") "
Apr 16 23:02:50.450706 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.450592 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls\") pod \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") "
Apr 16 23:02:50.450706 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.450614 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\" (UID: \"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384\") "
Apr 16 23:02:50.451023 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.450980 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-success-200-isvc-a9d3d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a9d3d-kube-rbac-proxy-sar-config") pod "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" (UID: "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384"). InnerVolumeSpecName "success-200-isvc-a9d3d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:02:50.452554 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.452530 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-kube-api-access-twr55" (OuterVolumeSpecName: "kube-api-access-twr55") pod "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" (UID: "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384"). InnerVolumeSpecName "kube-api-access-twr55". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:02:50.452673 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.452561 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" (UID: "fb2b33e5-3848-4e5d-b4eb-4c7a830d9384"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:02:50.552105 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.552028 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-proxy-tls\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 23:02:50.552105 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.552057 2574 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-success-200-isvc-a9d3d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 23:02:50.552105 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.552068 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twr55\" (UniqueName: \"kubernetes.io/projected/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384-kube-api-access-twr55\") on node \"ip-10-0-138-191.ec2.internal\" DevicePath \"\""
Apr 16 23:02:50.571278 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.571249 2574 generic.go:358] "Generic (PLEG): container finished" podID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerID="2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304" exitCode=0
Apr 16 23:02:50.571410 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.571285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" event={"ID":"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384","Type":"ContainerDied","Data":"2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304"}
Apr 16 23:02:50.571410 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.571309 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw" event={"ID":"fb2b33e5-3848-4e5d-b4eb-4c7a830d9384","Type":"ContainerDied","Data":"5b125894a1590e2bb5a9253fdda5486eb25d1c747c0f44e14e88c435e7c41f4c"}
Apr 16 23:02:50.571410 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.571329 2574 scope.go:117] "RemoveContainer" containerID="165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7"
Apr 16 23:02:50.571410 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.571330 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"
Apr 16 23:02:50.581017 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.580998 2574 scope.go:117] "RemoveContainer" containerID="2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304"
Apr 16 23:02:50.587823 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.587805 2574 scope.go:117] "RemoveContainer" containerID="165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7"
Apr 16 23:02:50.588073 ip-10-0-138-191 kubenswrapper[2574]: E0416 23:02:50.588051 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7\": container with ID starting with 165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7 not found: ID does not exist" containerID="165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7"
Apr 16 23:02:50.588145 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.588082 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7"} err="failed to get container status \"165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7\": rpc error: code = NotFound desc = could not find container \"165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7\": container with ID starting with 165953ad1bab9434d3f39295e1026fd39732fe7ea99f7a9532ad82868484fff7 not found: ID does not exist"
Apr 16 23:02:50.588145 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.588099 2574 scope.go:117] "RemoveContainer" containerID="2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304"
Apr 16 23:02:50.588305 ip-10-0-138-191 kubenswrapper[2574]: E0416 23:02:50.588290 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304\": container with ID starting with 2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304 not found: ID does not exist" containerID="2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304"
Apr 16 23:02:50.588346 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.588308 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304"} err="failed to get container status \"2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304\": rpc error: code = NotFound desc = could not find container \"2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304\": container with ID starting with 2656e5c941c93d1a7e18441f0a55419a6171d476ffa53846021f2d50c0a2a304 not found: ID does not exist"
Apr 16 23:02:50.593364 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.593344 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"]
Apr 16 23:02:50.598585 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:50.598566 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9d3d-predictor-5957bc5496-6b4rw"]
Apr 16 23:02:51.448025 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:02:51.447992 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" path="/var/lib/kubelet/pods/fb2b33e5-3848-4e5d-b4eb-4c7a830d9384/volumes"
Apr 16 23:03:15.190878 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:15.190838 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dmn2w_c08f4349-6022-4892-a46d-87843f55329d/global-pull-secret-syncer/0.log"
Apr 16 23:03:15.335009 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:15.334972 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jhsst_0208e93a-b489-404a-8e48-d0d66d76793f/konnectivity-agent/0.log"
Apr 16 23:03:15.379286 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:15.379241 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-191.ec2.internal_09a45ec1566c454073ee33f001f99f61/haproxy/0.log"
Apr 16 23:03:19.273756 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:19.273726 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-574cb6b97d-cmv9h_1c4166b9-1a37-4b4a-bce9-31afc91645a4/metrics-server/0.log"
Apr 16 23:03:19.478281 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:19.478253 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rlbd6_14853ac7-bf90-4539-8a3a-0f4dc64657ce/node-exporter/0.log"
Apr 16 23:03:19.499532 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:19.499509 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rlbd6_14853ac7-bf90-4539-8a3a-0f4dc64657ce/kube-rbac-proxy/0.log"
Apr 16 23:03:19.517730 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:19.517707 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rlbd6_14853ac7-bf90-4539-8a3a-0f4dc64657ce/init-textfile/0.log"
Apr 16 23:03:22.560856 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.560825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"]
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561106 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561118 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561130 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561135 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561145 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561150 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561162 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561167 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561174 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561179 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561185 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561190 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561226 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561233 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561241 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaa7f141-99ee-49c9-852a-6a76cf18eb47" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561246 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f047fe29-c47c-49f8-9d46-d154dd5bbf6c" containerName="kube-rbac-proxy"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561252 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kserve-container"
Apr 16 23:03:22.561257 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.561259 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb2b33e5-3848-4e5d-b4eb-4c7a830d9384" containerName="kube-rbac-proxy"
Apr 16 23:03:22.564047 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.564031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.566371 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.566345 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzm9p\"/\"kube-root-ca.crt\""
Apr 16 23:03:22.566515 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.566346 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vzm9p\"/\"default-dockercfg-bgrbr\""
Apr 16 23:03:22.566515 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.566418 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzm9p\"/\"openshift-service-ca.crt\""
Apr 16 23:03:22.575135 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.575107 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"]
Apr 16 23:03:22.688153 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.688121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-proc\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.688153 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.688152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-podres\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.688362 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.688176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrs6g\" (UniqueName: \"kubernetes.io/projected/14b77586-975f-468f-ad8c-cc682c574636-kube-api-access-nrs6g\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.688362 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.688261 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-sys\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.688362 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.688315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-lib-modules\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789621 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-lib-modules\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-proc\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-podres\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs6g\" (UniqueName: \"kubernetes.io/projected/14b77586-975f-468f-ad8c-cc682c574636-kube-api-access-nrs6g\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-sys\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-lib-modules\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"
Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-podres\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:22.789805 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-proc\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:22.790078 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.789920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14b77586-975f-468f-ad8c-cc682c574636-sys\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:22.797904 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.797879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrs6g\" (UniqueName: \"kubernetes.io/projected/14b77586-975f-468f-ad8c-cc682c574636-kube-api-access-nrs6g\") pod \"perf-node-gather-daemonset-xhxp4\" (UID: \"14b77586-975f-468f-ad8c-cc682c574636\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:22.874064 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.874040 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:22.988592 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.988560 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4"] Apr 16 23:03:22.991771 ip-10-0-138-191 kubenswrapper[2574]: W0416 23:03:22.991744 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod14b77586_975f_468f_ad8c_cc682c574636.slice/crio-9e0f3a4821d44072fba0d3b34a5ff314ba2d1445c3d96ee87aaab84ddd67f1b1 WatchSource:0}: Error finding container 9e0f3a4821d44072fba0d3b34a5ff314ba2d1445c3d96ee87aaab84ddd67f1b1: Status 404 returned error can't find the container with id 9e0f3a4821d44072fba0d3b34a5ff314ba2d1445c3d96ee87aaab84ddd67f1b1 Apr 16 23:03:22.993374 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:22.993359 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:03:23.093349 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.093326 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k6prk_cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0/dns/0.log" Apr 16 23:03:23.112125 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.112108 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k6prk_cefc3c86-7e52-4b0c-8dfc-08cf48cb79f0/kube-rbac-proxy/0.log" Apr 16 23:03:23.157778 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.157711 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dm8zf_d2e4bc53-aead-430d-aaf8-6def343926ef/dns-node-resolver/0.log" Apr 16 23:03:23.574520 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.574438 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7ffcd7d997-ppg7h_9edb9f95-1b7c-4a3d-81f4-b34dc212e4c7/registry/0.log" Apr 16 23:03:23.593356 
ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.593328 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2grf8_9614d8df-9bb5-4a22-a608-e18aa7fb1162/node-ca/0.log" Apr 16 23:03:23.659514 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.659480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" event={"ID":"14b77586-975f-468f-ad8c-cc682c574636","Type":"ContainerStarted","Data":"122649e5c888fc3dd111221da0a2beef6c5be3c6ea945783dd388aa86e2df4a7"} Apr 16 23:03:23.659682 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.659520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" event={"ID":"14b77586-975f-468f-ad8c-cc682c574636","Type":"ContainerStarted","Data":"9e0f3a4821d44072fba0d3b34a5ff314ba2d1445c3d96ee87aaab84ddd67f1b1"} Apr 16 23:03:23.659682 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:23.659553 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:24.634063 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:24.634036 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8qb8j_92752f12-a0e9-4d82-90c5-3b5beed10ab8/serve-healthcheck-canary/0.log" Apr 16 23:03:25.112578 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:25.112553 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kp5wn_2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0/kube-rbac-proxy/0.log" Apr 16 23:03:25.132350 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:25.132327 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kp5wn_2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0/exporter/0.log" Apr 16 23:03:25.184960 ip-10-0-138-191 kubenswrapper[2574]: I0416 
23:03:25.184920 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kp5wn_2b3b2f9b-76cb-4dab-9ed1-fb642c3531d0/extractor/0.log" Apr 16 23:03:27.238431 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:27.238403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84d7d5cfc6-dr69d_3dc8535b-b4c7-4da6-86a0-d78d1243c64d/manager/0.log" Apr 16 23:03:27.258988 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:27.258960 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-5blj5_cb1efc1b-240a-4c14-ac61-623284c6e9f7/manager/0.log" Apr 16 23:03:29.672084 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:29.672052 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" Apr 16 23:03:29.688973 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:29.688908 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-xhxp4" podStartSLOduration=7.688892049 podStartE2EDuration="7.688892049s" podCreationTimestamp="2026-04-16 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:03:23.674886003 +0000 UTC m=+2990.917702064" watchObservedRunningTime="2026-04-16 23:03:29.688892049 +0000 UTC m=+2996.931708418" Apr 16 23:03:32.676582 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.676552 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/kube-multus-additional-cni-plugins/0.log" Apr 16 23:03:32.697177 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.697151 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/egress-router-binary-copy/0.log" Apr 16 23:03:32.720329 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.720306 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/cni-plugins/0.log" Apr 16 23:03:32.742620 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.742602 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/bond-cni-plugin/0.log" Apr 16 23:03:32.762672 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.762648 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/routeoverride-cni/0.log" Apr 16 23:03:32.782997 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.782973 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/whereabouts-cni-bincopy/0.log" Apr 16 23:03:32.803480 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:32.803461 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5pd4l_29ce4801-ff31-4651-98b4-aba09699b7b6/whereabouts-cni/0.log" Apr 16 23:03:33.234382 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:33.234308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wjxwh_6aaeb270-2bd7-4647-889b-36ff3ceba5cf/kube-multus/0.log" Apr 16 23:03:33.259906 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:33.259878 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-knqmk_cc045530-7e0f-412e-98ba-915fe7aa6d22/network-metrics-daemon/0.log" Apr 16 23:03:33.280770 ip-10-0-138-191 kubenswrapper[2574]: 
I0416 23:03:33.280747 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-knqmk_cc045530-7e0f-412e-98ba-915fe7aa6d22/kube-rbac-proxy/0.log" Apr 16 23:03:33.495841 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:33.495812 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 23:03:33.500801 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:33.500782 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 23:03:34.721504 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.721479 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-controller/0.log" Apr 16 23:03:34.740282 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.740255 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/0.log" Apr 16 23:03:34.753324 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.753296 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovn-acl-logging/1.log" Apr 16 23:03:34.771818 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.771794 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/kube-rbac-proxy-node/0.log" Apr 16 23:03:34.792657 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.792633 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 23:03:34.810845 ip-10-0-138-191 
kubenswrapper[2574]: I0416 23:03:34.810827 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/northd/0.log" Apr 16 23:03:34.835374 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.835348 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/nbdb/0.log" Apr 16 23:03:34.854960 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.854939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/sbdb/0.log" Apr 16 23:03:34.956880 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:34.956850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z2ds7_6970b263-be92-460e-92da-a049f7bdbafe/ovnkube-controller/0.log" Apr 16 23:03:35.917343 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:35.917313 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4g7hv_06497fd8-f35d-4fd4-b42b-13ff6ded57e8/network-check-target-container/0.log" Apr 16 23:03:36.778992 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:36.778959 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gssjh_cc0a50b2-d73b-40da-a946-11e81bed8282/iptables-alerter/0.log" Apr 16 23:03:37.434337 ip-10-0-138-191 kubenswrapper[2574]: I0416 23:03:37.434308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lhbpr_32b9b798-09e2-4502-8f94-c5f194be68e3/tuned/0.log"