Apr 28 19:16:13.577707 ip-10-0-143-206 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:14.034022 ip-10-0-143-206 kubenswrapper[2539]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:14.034022 ip-10-0-143-206 kubenswrapper[2539]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:14.034022 ip-10-0-143-206 kubenswrapper[2539]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:14.034022 ip-10-0-143-206 kubenswrapper[2539]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:14.034022 ip-10-0-143-206 kubenswrapper[2539]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
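The deprecated-flag warnings above all point at the kubelet config file. A minimal sketch of the equivalent KubeletConfiguration stanzas (field names from the kubelet's kubelet.config.k8s.io/v1beta1 API; the values shown are placeholders, not read from this node):

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment -- config-file
# equivalents of the flags flagged as deprecated above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # was --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir
systemReserved:                                            # was --system-reserved
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration has no direct config-file equivalent;
# as the warning says, use eviction settings instead, e.g.:
evictionHard:
  memory.available: 100Mi
```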
Apr 28 19:16:14.036743 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.036652 2539 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046102 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046126 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046130 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046134 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046138 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046141 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:14.046133 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046145 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046148 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046151 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046154 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046156 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046159 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046164 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046169 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046172 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046175 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046178 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046181 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046184 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046186 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046189 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046192 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046195 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046198 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046200 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046204 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:14.046434 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046206 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046209 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046212 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046215 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046217 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046220 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046222 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046225 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046227 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046230 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046232 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046235 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046237 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046240 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046244 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046247 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046249 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046252 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046255 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046257 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:14.046960 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046260 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046263 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046266 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046268 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046271 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046273 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046276 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046278 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046281 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046283 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046286 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046288 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046291 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046294 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046296 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046298 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046301 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046304 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046307 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046309 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:14.047496 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046311 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046314 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046316 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046320 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046324 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046327 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046330 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046334 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046336 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046339 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046342 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046345 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046348 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046351 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046354 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046356 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046359 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046362 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046365 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:14.048003 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046367 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046804 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046811 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046813 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046816 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046819 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046822 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046825 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046829 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046832 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046835 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046838 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046841 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046844 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046846 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046849 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046851 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046854 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046857 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:14.048490 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046861 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046864 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046866 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046869 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046872 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046874 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046877 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046880 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046883 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046885 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046888 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046891 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046893 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046896 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046898 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046901 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046903 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046906 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046908 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046911 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:14.048956 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046913 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046916 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046919 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046923 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046927 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046930 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046932 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046935 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046937 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046940 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046942 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046946 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046949 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046952 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046954 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046957 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046959 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046961 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046965 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046968 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:14.049497 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046971 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046973 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046976 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046978 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046982 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046984 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046987 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046989 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046992 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046994 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046997 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.046999 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047002 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047004 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047007 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047009 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047012 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047014 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047017 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047019 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:14.049986 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047022 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047024 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047027 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047029 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047032 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047035 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047037 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.047040 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048940 2539 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048953 2539 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048968 2539 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048972 2539 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048978 2539 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048982 2539 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048986 2539 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048991 2539 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048995 2539 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.048998 2539 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049001 2539 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049004 2539 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049008 2539 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049011 2539 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049014 2539 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:14.050519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049017 2539 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049020 2539 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049023 2539 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049026 2539 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049030 2539 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049033 2539 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049036 2539 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049039 2539 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049043 2539 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049047 2539 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049051 2539 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049054 2539 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049057 2539 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049061 2539 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049064 2539 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049067 2539 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049070 2539 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049074 2539 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049078 2539 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049081 2539 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049084 2539 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049087 2539 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049091 2539 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049093 2539 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049098 2539 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:16:14.051068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049102 2539 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049105 2539 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049108 2539 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049111 2539 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049114 2539 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049117 2539 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049120 2539 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049123 2539 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049126 2539 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049129 2539 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049132 2539 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049135 2539 flags.go:64] FLAG: --experimental-mounter-path="" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049138 2539 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049141 2539 flags.go:64] FLAG: --fail-swap-on="true" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049144 2539 flags.go:64] FLAG: --feature-gates="" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049148 2539 flags.go:64] FLAG: --file-check-frequency="20s" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049151 2539 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049154 2539 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049157 2539 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049161 2539 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049164 2539 flags.go:64] FLAG: --help="false" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049167 2539 flags.go:64] FLAG: --hostname-override="ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049170 2539 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049174 2539 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:16:14.051687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049176 2539 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049180 2539 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049183 2539 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049187 2539 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049189 2539 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049192 2539 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049196 2539 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049199 2539 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049202 2539 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049205 2539 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049208 2539 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049210 2539 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049214 2539 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049216 2539 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:16:14.052272 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:16:14.049219 2539 flags.go:64] FLAG: --lock-file="" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049222 2539 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049225 2539 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049228 2539 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049233 2539 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049236 2539 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049239 2539 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049241 2539 flags.go:64] FLAG: --logging-format="text" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049244 2539 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049248 2539 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:16:14.052272 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049255 2539 flags.go:64] FLAG: --manifest-url="" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049258 2539 flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049262 2539 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049265 2539 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049271 2539 flags.go:64] FLAG: --max-pods="110" Apr 28 
19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049274 2539 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049277 2539 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049281 2539 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049284 2539 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049287 2539 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049290 2539 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049293 2539 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049301 2539 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049305 2539 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049308 2539 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049312 2539 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049316 2539 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049321 2539 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:16:14.049324 2539 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049328 2539 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049331 2539 flags.go:64] FLAG: --port="10250" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049334 2539 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049337 2539 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02de2a59961d65609" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049340 2539 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:16:14.052854 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049343 2539 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049346 2539 flags.go:64] FLAG: --register-node="true" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049349 2539 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049352 2539 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049356 2539 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049359 2539 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049362 2539 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049365 2539 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049369 2539 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049385 2539 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049389 2539 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049392 2539 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049396 2539 flags.go:64] FLAG: --runonce="false" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049399 2539 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049402 2539 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049406 2539 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049409 2539 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049412 2539 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049415 2539 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049418 2539 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049422 2539 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049425 2539 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049428 2539 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049430 2539 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 
19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049434 2539 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049437 2539 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:16:14.053483 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049440 2539 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049443 2539 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049448 2539 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049451 2539 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049454 2539 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049458 2539 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049461 2539 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049464 2539 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049467 2539 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049470 2539 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049473 2539 flags.go:64] FLAG: --v="2" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049477 2539 flags.go:64] FLAG: --version="false" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049481 2539 flags.go:64] FLAG: --vmodule="" 
Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049486 2539 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.049489 2539 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049598 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049602 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049605 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049609 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049612 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049614 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049617 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049620 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:14.054122 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049623 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049626 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049629 2539 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049632 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049635 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049637 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049640 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049643 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049645 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049648 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049651 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049654 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049656 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049659 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049661 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:14.054709 ip-10-0-143-206 
kubenswrapper[2539]: W0428 19:16:14.049664 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049666 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049669 2539 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049671 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:14.054709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049674 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049678 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049682 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049685 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049688 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049691 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049694 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049697 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049700 2539 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049702 2539 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049705 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049707 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049710 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049712 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049715 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049717 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049720 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049722 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049725 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049727 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:16:14.055215 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049730 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 
19:16:14.049732 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049735 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049738 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049741 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049743 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049746 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049749 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049752 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049755 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049759 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049762 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049764 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049767 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049769 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049772 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049774 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049777 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049780 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049782 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:14.055769 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049785 2539 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049788 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049790 2539 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049793 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049795 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049798 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049800 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049803 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049805 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049808 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049810 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049813 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049815 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049818 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049821 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049824 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049827 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049829 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:14.056261 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.049832 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:14.056782 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.050667 2539 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:14.056844 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.056823 2539 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:16:14.056885 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.056846 2539 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:16:14.056919 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056902 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:14.056919 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056908 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:14.056919 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056911 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:14.056919 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056914 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:14.056919 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056917 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:14.056919 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056922 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056925 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056928 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056932 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056935 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056937 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056941 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056943 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056946 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056949 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056951 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056954 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056957 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056959 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056962 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056964 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056967 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056970 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056972 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056975 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:14.057070 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056977 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056980 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056983 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056985 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056988 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056991 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056995 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.056999 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057003 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057006 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057009 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057012 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057015 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057018 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057021 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057024 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057026 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057029 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057032 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057034 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:14.057567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057037 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057039 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057042 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057045 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057048 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057050 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057053 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057056 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057059 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057063 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057067 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057070 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057073 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057075 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057078 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057080 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057083 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057086 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057090 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057093 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:14.058040 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057096 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057098 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057101 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057103 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057106 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057108 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057111 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057114 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057116 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057118 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057121 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057124 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057126 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057129 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057131 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057133 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057136 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057138 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057141 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057144 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:14.058541 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057146 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.057151 2539 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057282 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057286 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057289 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057292 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057295 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057298 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057300 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057303 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057307 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057309 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057312 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057315 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057318 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:14.059021 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057320 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057323 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057325 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057327 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057330 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057332 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057335 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057337 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057340 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057342 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057345 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057347 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057350 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057353 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057355 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057358 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057361 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057363 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057366 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057382 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:14.059399 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057387 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057390 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057393 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057395 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057399 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057401 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057404 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057407 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057410 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057412 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057416 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057418 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057421 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057424 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057426 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057429 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057431 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057434 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057437 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057439 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:14.059916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057442 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057444 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057447 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057450 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057452 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057455 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057457 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057460 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057463 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057465 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057468 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057470 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057473 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057475 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057477 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057480 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057483 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057485 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057488 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057490 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:14.060539 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057493 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057496 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057498 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057502 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057504 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057506 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057509 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057512 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057516 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057519 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057522 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057524 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:14.057527 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.057532 2539 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.058181 2539 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:16:14.061041 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.060123 2539 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:16:14.061533 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.061045 2539 server.go:1019] "Starting client certificate rotation"
Apr 28 19:16:14.061533 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.061151 2539 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:14.061533 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.061211 2539 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:14.089778 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.089747 2539 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:14.095735 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.095708 2539 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:14.113255 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.113231 2539 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:16:14.118571 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.118553 2539 log.go:25] "Validated CRI v1 image API"
Apr 28 19:16:14.119789 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.119772 2539 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:16:14.125136 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.125111 2539 fs.go:135] Filesystem UUIDs: map[5b2a6b97-4aed-4a87-b1ca-84d7c5177355:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a315492a-9381-409a-a822-01bb69101a78:/dev/nvme0n1p4]
Apr 28 19:16:14.125205 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.125134 2539 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:16:14.129365 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.129348 2539 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:14.132319 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.132198 2539 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:14.128901774 +0000 UTC m=+0.425257129 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3193142 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec268ac401113fb1102956ab1c2d642c SystemUUID:ec268ac4-0111-3fb1-1029-56ab1c2d642c BootID:5c04adcb-0c82-4619-a223-146882ebf1dd Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6e:19:a7:25:0f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6e:19:a7:25:0f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:4c:5b:c3:43:53 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:16:14.132319 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.132315 2539 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:16:14.132433 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.132422 2539 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:16:14.133644 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.133618 2539 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:16:14.133794 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.133647 2539 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-206.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:16:14.133841 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.133804 2539 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:16:14.133841 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.133813 2539 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:16:14.133841 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.133830
2539 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:14.135459 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.135447 2539 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:14.136287 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.136277 2539 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:14.136604 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.136593 2539 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 28 19:16:14.139029 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.139020 2539 kubelet.go:491] "Attempting to sync node with API server" Apr 28 19:16:14.139069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.139042 2539 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 28 19:16:14.139069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.139058 2539 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 28 19:16:14.139069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.139069 2539 kubelet.go:397] "Adding apiserver pod source" Apr 28 19:16:14.139166 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.139081 2539 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 28 19:16:14.140353 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.140339 2539 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:14.140409 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.140367 2539 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:14.143968 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.143952 2539 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 28 19:16:14.145178 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:16:14.145164 2539 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:16:14.146909 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146898 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:16:14.146955 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146915 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:16:14.146955 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146921 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:16:14.146955 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146928 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:16:14.146955 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146936 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:16:14.146955 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146942 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:16:14.146955 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146948 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:16:14.147118 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146961 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:16:14.147118 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146968 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:16:14.147118 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146973 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:16:14.147118 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146986 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 
19:16:14.147118 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.146995 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:16:14.148547 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.148536 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:16:14.148547 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.148547 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:16:14.152090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.152075 2539 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:16:14.152165 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.152113 2539 server.go:1295] "Started kubelet" Apr 28 19:16:14.152235 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.152183 2539 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:16:14.152271 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.152192 2539 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:16:14.152305 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.152288 2539 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:16:14.153046 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.153016 2539 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-206.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:16:14.153142 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.153076 2539 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Apr 28 19:16:14.153104 ip-10-0-143-206 systemd[1]: Started Kubernetes Kubelet. Apr 28 19:16:14.153297 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.153287 2539 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-206.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:16:14.153396 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.153366 2539 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:16:14.154954 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.154939 2539 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:16:14.162340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.162318 2539 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:16:14.162461 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.162339 2539 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:14.163057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163006 2539 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:16:14.163057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163028 2539 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:16:14.163057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163044 2539 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:16:14.163242 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.163129 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.163242 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163152 2539 factory.go:55] Registering systemd factory Apr 28 19:16:14.163242 ip-10-0-143-206 kubenswrapper[2539]: 
I0428 19:16:14.163173 2539 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:16:14.163242 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163179 2539 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:16:14.163242 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163188 2539 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:16:14.163500 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163486 2539 factory.go:153] Registering CRI-O factory Apr 28 19:16:14.163560 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163505 2539 factory.go:223] Registration of the crio container factory successfully Apr 28 19:16:14.163560 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163558 2539 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:16:14.163660 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163583 2539 factory.go:103] Registering Raw factory Apr 28 19:16:14.163660 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.163598 2539 manager.go:1196] Started watching for new ooms in manager Apr 28 19:16:14.163853 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.162799 2539 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-206.ec2.internal.18aa9b50c4c61ad8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-206.ec2.internal,UID:ip-10-0-143-206.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-206.ec2.internal,},FirstTimestamp:2026-04-28 19:16:14.15208828 
+0000 UTC m=+0.448443634,LastTimestamp:2026-04-28 19:16:14.15208828 +0000 UTC m=+0.448443634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-206.ec2.internal,}" Apr 28 19:16:14.164053 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.164011 2539 manager.go:319] Starting recovery of all containers Apr 28 19:16:14.166447 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.166412 2539 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:16:14.171238 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.171209 2539 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:16:14.171988 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.171805 2539 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-206.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:16:14.174535 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.174516 2539 manager.go:324] Recovery completed Apr 28 19:16:14.179850 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.179832 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:14.182117 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.182100 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:14.182183 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.182130 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:14.182183 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.182141 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:14.182644 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.182630 2539 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:16:14.182644 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.182640 2539 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:16:14.182723 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.182659 2539 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:14.183242 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.183227 2539 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ntx95" Apr 28 19:16:14.184214 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.184150 2539 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-206.ec2.internal.18aa9b50c69050b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-206.ec2.internal,UID:ip-10-0-143-206.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-206.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-206.ec2.internal,},FirstTimestamp:2026-04-28 19:16:14.182117554 +0000 UTC m=+0.478472909,LastTimestamp:2026-04-28 19:16:14.182117554 +0000 UTC 
m=+0.478472909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-206.ec2.internal,}" Apr 28 19:16:14.185182 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.185168 2539 policy_none.go:49] "None policy: Start" Apr 28 19:16:14.185262 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.185187 2539 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:16:14.185262 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.185201 2539 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:16:14.194148 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.194127 2539 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ntx95" Apr 28 19:16:14.196553 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.196485 2539 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-206.ec2.internal.18aa9b50c6909551 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-206.ec2.internal,UID:ip-10-0-143-206.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-143-206.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-143-206.ec2.internal,},FirstTimestamp:2026-04-28 19:16:14.182135121 +0000 UTC m=+0.478490476,LastTimestamp:2026-04-28 19:16:14.182135121 +0000 UTC m=+0.478490476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-206.ec2.internal,}" Apr 28 19:16:14.232064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232039 2539 manager.go:341] "Starting Device 
Plugin manager" Apr 28 19:16:14.232211 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.232094 2539 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:16:14.232211 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232107 2539 server.go:85] "Starting device plugin registration server" Apr 28 19:16:14.232422 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232407 2539 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:16:14.232488 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232423 2539 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:16:14.232554 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232540 2539 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:16:14.232633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232621 2539 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:16:14.232633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.232631 2539 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:16:14.233175 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.233147 2539 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:16:14.233256 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.233199 2539 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.265025 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.264985 2539 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 28 19:16:14.266306 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.266289 2539 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:16:14.266423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.266317 2539 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:16:14.266423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.266342 2539 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 28 19:16:14.266423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.266351 2539 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:16:14.266423 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.266404 2539 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:16:14.271927 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.271911 2539 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:14.333458 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.333356 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:14.334410 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.334391 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:14.334517 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.334429 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:14.334517 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.334445 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:14.334517 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.334476 2539 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.345166 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.345145 2539 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.345247 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.345169 2539 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-206.ec2.internal\": node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.366489 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.366464 2539 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal"] Apr 28 19:16:14.366588 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.366542 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:14.367367 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.367348 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:14.367479 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.367395 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:14.367479 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.367408 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:14.368502 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.368489 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:14.368669 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.368656 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.368705 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.368684 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:14.369209 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.369193 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:14.369286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.369223 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:14.369286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.369193 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:14.369286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.369234 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:14.369286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.369263 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:14.369286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.369282 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:14.370326 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.370310 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.370425 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.370337 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:14.370948 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.370934 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:14.370999 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.370958 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:14.370999 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.370968 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:14.376924 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.376906 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.399157 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.399135 2539 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-206.ec2.internal\" not found" node="ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.403491 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.403476 2539 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-206.ec2.internal\" not found" node="ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.464487 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.464462 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e5713c9e782ac5c4d7eb091b097c3dde-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal\" (UID: \"e5713c9e782ac5c4d7eb091b097c3dde\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.464487 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.464490 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5713c9e782ac5c4d7eb091b097c3dde-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal\" (UID: \"e5713c9e782ac5c4d7eb091b097c3dde\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.464661 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.464508 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6d10770512520c6fc076a2363490adc7-config\") pod \"kube-apiserver-proxy-ip-10-0-143-206.ec2.internal\" (UID: \"6d10770512520c6fc076a2363490adc7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.477767 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.477744 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.565396 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.565344 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e5713c9e782ac5c4d7eb091b097c3dde-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal\" (UID: \"e5713c9e782ac5c4d7eb091b097c3dde\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.565569 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.565405 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e5713c9e782ac5c4d7eb091b097c3dde-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal\" (UID: \"e5713c9e782ac5c4d7eb091b097c3dde\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.565569 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.565447 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6d10770512520c6fc076a2363490adc7-config\") pod \"kube-apiserver-proxy-ip-10-0-143-206.ec2.internal\" (UID: \"6d10770512520c6fc076a2363490adc7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.565569 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.565459 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e5713c9e782ac5c4d7eb091b097c3dde-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal\" (UID: \"e5713c9e782ac5c4d7eb091b097c3dde\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.565569 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.565495 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6d10770512520c6fc076a2363490adc7-config\") pod \"kube-apiserver-proxy-ip-10-0-143-206.ec2.internal\" (UID: \"6d10770512520c6fc076a2363490adc7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.565569 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.565516 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5713c9e782ac5c4d7eb091b097c3dde-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal\" (UID: \"e5713c9e782ac5c4d7eb091b097c3dde\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.578438 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.578411 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.678701 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.678630 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.700830 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.700795 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.706297 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.706274 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" Apr 28 19:16:14.778792 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.778755 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.879360 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.879311 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:14.970581 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:14.970521 2539 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:14.980111 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:14.980089 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.061323 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.061287 2539 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 28 19:16:15.061877 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.061493 2539 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 28 19:16:15.061877 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.061494 2539 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 28 19:16:15.080759 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.080731 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.162500 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.162473 2539 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:15.171068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.171036 2539 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:16:15.181621 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.181596 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.197796 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.197749 2539 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:14 +0000 UTC" deadline="2027-12-10 02:13:11.370379873 +0000 UTC" Apr 28 19:16:15.197796 ip-10-0-143-206 kubenswrapper[2539]: 
I0428 19:16:15.197791 2539 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14166h56m56.17259201s" Apr 28 19:16:15.199861 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.199844 2539 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h9jnm" Apr 28 19:16:15.204889 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:15.204851 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d10770512520c6fc076a2363490adc7.slice/crio-ef33d104ef2f3a09e641bb4c75790856f0e419d0dd2a242997228bf838e80d65 WatchSource:0}: Error finding container ef33d104ef2f3a09e641bb4c75790856f0e419d0dd2a242997228bf838e80d65: Status 404 returned error can't find the container with id ef33d104ef2f3a09e641bb4c75790856f0e419d0dd2a242997228bf838e80d65 Apr 28 19:16:15.205342 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:15.205323 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5713c9e782ac5c4d7eb091b097c3dde.slice/crio-2d9c4fe4fa02c850842772f83d8a62d94460d6fe6ac17dd7391cbec8b73da5b1 WatchSource:0}: Error finding container 2d9c4fe4fa02c850842772f83d8a62d94460d6fe6ac17dd7391cbec8b73da5b1: Status 404 returned error can't find the container with id 2d9c4fe4fa02c850842772f83d8a62d94460d6fe6ac17dd7391cbec8b73da5b1 Apr 28 19:16:15.206648 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.206630 2539 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h9jnm" Apr 28 19:16:15.210027 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.210013 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:16:15.269332 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.269232 2539 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" event={"ID":"e5713c9e782ac5c4d7eb091b097c3dde","Type":"ContainerStarted","Data":"2d9c4fe4fa02c850842772f83d8a62d94460d6fe6ac17dd7391cbec8b73da5b1"} Apr 28 19:16:15.270192 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.270170 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" event={"ID":"6d10770512520c6fc076a2363490adc7","Type":"ContainerStarted","Data":"ef33d104ef2f3a09e641bb4c75790856f0e419d0dd2a242997228bf838e80d65"} Apr 28 19:16:15.282321 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.282303 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.382810 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.382778 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.483390 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.483349 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.504612 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.504584 2539 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:15.584196 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.584103 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.685131 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:15.685093 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-206.ec2.internal\" not found" Apr 28 19:16:15.689422 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.689245 2539 reflector.go:430] "Caches 
populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:15.763328 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.763294 2539 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" Apr 28 19:16:15.779824 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.779795 2539 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:16:15.780899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.780871 2539 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" Apr 28 19:16:15.798863 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:15.798808 2539 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:16:16.140800 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.140770 2539 apiserver.go:52] "Watching apiserver" Apr 28 19:16:16.147403 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.147362 2539 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 28 19:16:16.147764 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.147739 2539 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal","openshift-multus/multus-additional-cni-plugins-j682m","openshift-multus/multus-nlr5m","openshift-multus/network-metrics-daemon-txdd9","openshift-network-diagnostics/network-check-target-wg74q","kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal","openshift-dns/node-resolver-4kjrn","openshift-image-registry/node-ca-g4cj8","openshift-network-operator/iptables-alerter-sfcs7","openshift-ovn-kubernetes/ovnkube-node-977nw","kube-system/konnectivity-agent-c9r6q","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k","openshift-cluster-node-tuning-operator/tuned-9k5fw"] Apr 28 19:16:16.150431 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.150394 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.151647 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.151604 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.151902 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.151863 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.152969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.152898 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:16.152969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.152958 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.152969 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.152996 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:16.153222 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.153063 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4z7ww\"" Apr 28 19:16:16.153222 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.153199 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.153933 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.153909 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:16.154049 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.153975 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:16.154121 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.154091 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.154121 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.154108 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 28 19:16:16.154224 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.154151 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 28 19:16:16.154434 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.154421 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.155989 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.155971 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.155989 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.155982 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:16:16.157359 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.157344 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.157546 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.157529 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.157622 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.157565 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jhqhr\"" Apr 28 19:16:16.158268 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.158248 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 28 19:16:16.158404 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.158363 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-659j2\"" Apr 28 19:16:16.159296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.158842 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.159296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.159124 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.159865 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.159837 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:16:16.159954 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.159936 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:16.160275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.160105 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.160572 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.160555 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 28 19:16:16.161087 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.161069 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.161485 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.161468 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.164057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164039 2539 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 28 19:16:16.164279 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164245 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 28 19:16:16.164362 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164293 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8x9t6\"" Apr 28 19:16:16.164448 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164406 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 28 19:16:16.164504 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164492 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.164589 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164568 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 28 19:16:16.164714 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164697 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.164779 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164713 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 28 19:16:16.164779 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164774 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.164874 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164779 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jzmcr\"" Apr 28 19:16:16.164874 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164802 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.164874 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164498 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.165022 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164888 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 28 19:16:16.165022 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.164930 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 28 19:16:16.165518 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.165499 2539 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fxtrj\"" Apr 28 19:16:16.165781 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.165760 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 28 19:16:16.165848 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.165794 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-w4mjc\"" Apr 28 19:16:16.165905 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.165857 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 28 19:16:16.168099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.165711 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 28 19:16:16.168099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.166263 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zj9wk\"" Apr 28 19:16:16.168099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.166512 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-k9gmt\"" Apr 28 19:16:16.172457 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172437 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-systemd\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.172534 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172465 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g6x98\" (UniqueName: \"kubernetes.io/projected/c64d6ced-de54-4be0-9661-bf00d68c4ce0-kube-api-access-g6x98\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.172534 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172483 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0daae916-5659-44ea-96b4-ed96cbfa9da3-host\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.172534 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172501 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5217b4ac-ec08-4f15-af88-99f26535e549-tmp-dir\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.172691 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172569 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-socket-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.172691 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172622 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-sys-fs\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.172691 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:16:16.172655 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-systemd\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.172691 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172683 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5qzz\" (UniqueName: \"kubernetes.io/projected/58213e63-9543-4438-bbbf-d242d52abc8f-kube-api-access-x5qzz\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.172857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172710 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-kubelet\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.172857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172736 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:16.172857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172761 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-os-release\") 
pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.172857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172800 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-socket-dir-parent\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.172857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172838 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-slash\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172875 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-ovnkube-script-lib\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172904 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fr5g\" (UniqueName: \"kubernetes.io/projected/5ced465f-4a51-4441-b363-efac6c32deb0-kube-api-access-5fr5g\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172958 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-system-cni-dir\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172980 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.172997 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-conf-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173011 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-etc-kubernetes\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.173026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173025 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-var-lib-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.173234 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173039 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-run\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173054 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/006eba5f-c69e-415e-b993-2a2c72ae4df3-konnectivity-ca\") pod \"konnectivity-agent-c9r6q\" (UID: \"006eba5f-c69e-415e-b993-2a2c72ae4df3\") " pod="kube-system/konnectivity-agent-c9r6q"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173079 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-host\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173107 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173131 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-cni-bin\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173152 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a723c52e-a9e7-4b73-852d-22d1fd084cfa-iptables-alerter-script\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173179 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-daemon-config\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173204 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-run-netns\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173219 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-ovn\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173260 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-cni-netd\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173295 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-lib-modules\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173325 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69d58ea3-6f19-4956-a782-6313891c2513-tmp\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173355 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5217b4ac-ec08-4f15-af88-99f26535e549-hosts-file\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173403 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-cni-binary-copy\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173426 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173451 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysconfig\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173475 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-cni-multus\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173522 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvnkl\" (UniqueName: \"kubernetes.io/projected/0daae916-5659-44ea-96b4-ed96cbfa9da3-kube-api-access-cvnkl\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8"
Apr 28 19:16:16.173545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173545 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4tt7\" (UniqueName: \"kubernetes.io/projected/69d58ea3-6f19-4956-a782-6313891c2513-kube-api-access-r4tt7\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173568 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-cnibin\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173593 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-os-release\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173615 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-k8s-cni-cncf-io\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173637 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-kubelet\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173675 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-etc-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173698 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/006eba5f-c69e-415e-b993-2a2c72ae4df3-agent-certs\") pod \"konnectivity-agent-c9r6q\" (UID: \"006eba5f-c69e-415e-b993-2a2c72ae4df3\") " pod="kube-system/konnectivity-agent-c9r6q"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173721 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx8x6\" (UniqueName: \"kubernetes.io/projected/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-kube-api-access-lx8x6\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173742 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69d58ea3-6f19-4956-a782-6313891c2513-etc-tuned\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173774 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-cni-bin\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173800 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-multus-certs\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173830 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173855 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-ovnkube-config\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173882 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-env-overrides\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173902 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a723c52e-a9e7-4b73-852d-22d1fd084cfa-host-slash\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7"
Apr 28 19:16:16.173930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173926 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwd9\" (UniqueName: \"kubernetes.io/projected/5217b4ac-ec08-4f15-af88-99f26535e549-kube-api-access-tqwd9\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173951 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.173976 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-netns\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174005 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-registration-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174028 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2rh\" (UniqueName: \"kubernetes.io/projected/a723c52e-a9e7-4b73-852d-22d1fd084cfa-kube-api-access-lr2rh\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174050 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-modprobe-d\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174071 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-cnibin\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174092 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-node-log\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174114 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ced465f-4a51-4441-b363-efac6c32deb0-ovn-node-metrics-cert\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174150 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysctl-conf\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174172 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-sys\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174196 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-device-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174219 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0daae916-5659-44ea-96b4-ed96cbfa9da3-serviceca\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174241 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysctl-d\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174263 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174286 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-systemd-units\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.174436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174308 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174328 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-var-lib-kubelet\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174349 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-run-ovn-kubernetes\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174387 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-etc-selinux\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174423 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174454 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-system-cni-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174476 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-cni-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174497 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-log-socket\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174550 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-kubernetes\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174570 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-hostroot\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.175069 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.174587 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc98d\" (UniqueName: \"kubernetes.io/projected/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-kube-api-access-dc98d\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.207468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.207434 2539 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:15 +0000 UTC" deadline="2027-10-19 01:37:52.76250763 +0000 UTC"
Apr 28 19:16:16.207468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.207466 2539 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12918h21m36.555044269s"
Apr 28 19:16:16.275813 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275773 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-system-cni-dir\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.275813 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275814 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275839 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-conf-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275859 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-etc-kubernetes\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275881 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-var-lib-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275896 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-system-cni-dir\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275903 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-run\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275951 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-run\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275961 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/006eba5f-c69e-415e-b993-2a2c72ae4df3-konnectivity-ca\") pod \"konnectivity-agent-c9r6q\" (UID: \"006eba5f-c69e-415e-b993-2a2c72ae4df3\") " pod="kube-system/konnectivity-agent-c9r6q"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275988 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-host\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.275992 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-conf-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276016 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.276032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276030 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-etc-kubernetes\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276046 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-cni-bin\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276067 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-var-lib-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276071 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a723c52e-a9e7-4b73-852d-22d1fd084cfa-iptables-alerter-script\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276097 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-daemon-config\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276122 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-run-netns\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276147 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-ovn\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276185 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-cni-netd\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276208 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-lib-modules\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276231 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69d58ea3-6f19-4956-a782-6313891c2513-tmp\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276255 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5217b4ac-ec08-4f15-af88-99f26535e549-hosts-file\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276280 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-cni-binary-copy\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276304 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276330 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysconfig\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276358 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-cni-multus\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276400 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvnkl\" (UniqueName: \"kubernetes.io/projected/0daae916-5659-44ea-96b4-ed96cbfa9da3-kube-api-access-cvnkl\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276425 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4tt7\" (UniqueName: \"kubernetes.io/projected/69d58ea3-6f19-4956-a782-6313891c2513-kube-api-access-r4tt7\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw"
Apr 28 19:16:16.276565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276452 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-cnibin\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276466 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276477 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-os-release\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m"
Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276504 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-k8s-cni-cncf-io\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m"
Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276527 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-kubelet\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276552 2539
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-etc-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276555 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276576 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/006eba5f-c69e-415e-b993-2a2c72ae4df3-agent-certs\") pod \"konnectivity-agent-c9r6q\" (UID: \"006eba5f-c69e-415e-b993-2a2c72ae4df3\") " pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276627 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-host\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276629 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx8x6\" (UniqueName: \"kubernetes.io/projected/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-kube-api-access-lx8x6\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:16.277340 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276671 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69d58ea3-6f19-4956-a782-6313891c2513-etc-tuned\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276698 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-cni-bin\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276727 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-multus-certs\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276754 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276779 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-ovnkube-config\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276805 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-env-overrides\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276830 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a723c52e-a9e7-4b73-852d-22d1fd084cfa-host-slash\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.277340 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276875 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwd9\" (UniqueName: \"kubernetes.io/projected/5217b4ac-ec08-4f15-af88-99f26535e549-kube-api-access-tqwd9\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276914 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276941 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/006eba5f-c69e-415e-b993-2a2c72ae4df3-konnectivity-ca\") pod \"konnectivity-agent-c9r6q\" 
(UID: \"006eba5f-c69e-415e-b993-2a2c72ae4df3\") " pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276985 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-cni-bin\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276995 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-netns\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277027 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-multus-certs\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277072 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.276942 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-netns\") pod \"multus-nlr5m\" (UID: 
\"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277254 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-registration-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277275 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2rh\" (UniqueName: \"kubernetes.io/projected/a723c52e-a9e7-4b73-852d-22d1fd084cfa-kube-api-access-lr2rh\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277295 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-modprobe-d\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277313 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-cnibin\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277329 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-node-log\") pod 
\"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277346 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ced465f-4a51-4441-b363-efac6c32deb0-ovn-node-metrics-cert\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277364 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysctl-conf\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277401 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-sys\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277408 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a723c52e-a9e7-4b73-852d-22d1fd084cfa-iptables-alerter-script\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.278173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277420 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-device-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277436 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0daae916-5659-44ea-96b4-ed96cbfa9da3-serviceca\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277452 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysctl-d\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277469 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277485 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-systemd-units\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277634 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-ovnkube-config\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277656 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-env-overrides\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277690 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-cni-bin\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277713 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a723c52e-a9e7-4b73-852d-22d1fd084cfa-host-slash\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277760 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysconfig\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277806 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-run-netns\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277843 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277848 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-ovn\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277882 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-cni-netd\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277892 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-daemon-config\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277924 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278015 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysctl-conf\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278022 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-lib-modules\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.278903 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278036 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5217b4ac-ec08-4f15-af88-99f26535e549-hosts-file\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278043 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-sys\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277249 2539 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278105 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-registration-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278196 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-sysctl-d\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278272 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-cnibin\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278342 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58213e63-9543-4438-bbbf-d242d52abc8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.277886 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278400 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-node-log\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278272 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-modprobe-d\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278940 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58213e63-9543-4438-bbbf-d242d52abc8f-os-release\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278980 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-cni-multus\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.278387 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-cnibin\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279148 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-kubelet\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279209 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-run-k8s-cni-cncf-io\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279212 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-etc-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279267 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-var-lib-kubelet\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279301 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-run-ovn-kubernetes\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.279760 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279335 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-etc-selinux\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279365 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279410 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0daae916-5659-44ea-96b4-ed96cbfa9da3-serviceca\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279418 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-system-cni-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279452 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-cni-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279481 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-log-socket\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279488 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-var-lib-kubelet\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279610 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-etc-selinux\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279659 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-openvswitch\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279539 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-kubernetes\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279717 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-systemd-units\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279728 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-hostroot\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279761 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-run-ovn-kubernetes\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279767 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc98d\" (UniqueName: \"kubernetes.io/projected/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-kube-api-access-dc98d\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279810 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-systemd\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279816 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-device-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279845 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x98\" (UniqueName: \"kubernetes.io/projected/c64d6ced-de54-4be0-9661-bf00d68c4ce0-kube-api-access-g6x98\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279866 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-run-systemd\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.280580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279877 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0daae916-5659-44ea-96b4-ed96cbfa9da3-host\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279916 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/5217b4ac-ec08-4f15-af88-99f26535e549-tmp-dir\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.279953 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-log-socket\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280041 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0daae916-5659-44ea-96b4-ed96cbfa9da3-host\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280082 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-kubernetes\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280108 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-socket-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280141 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-sys-fs\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280148 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-cni-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.280151 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280211 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-systemd\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280256 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5qzz\" (UniqueName: \"kubernetes.io/projected/58213e63-9543-4438-bbbf-d242d52abc8f-kube-api-access-x5qzz\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280264 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-hostroot\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " 
pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280295 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-kubelet\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280304 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-socket-dir\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280458 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c64d6ced-de54-4be0-9661-bf00d68c4ce0-sys-fs\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280509 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-system-cni-dir\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.280589 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:16.78056576 +0000 UTC m=+3.076921120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.281275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280692 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69d58ea3-6f19-4956-a782-6313891c2513-etc-systemd\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280817 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-host-var-lib-kubelet\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280892 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280928 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-cni-binary-copy\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 
28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.280960 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-os-release\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281226 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-socket-dir-parent\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281264 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-slash\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281291 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-ovnkube-script-lib\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281316 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-os-release\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.282070 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:16:16.281363 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fr5g\" (UniqueName: \"kubernetes.io/projected/5ced465f-4a51-4441-b363-efac6c32deb0-kube-api-access-5fr5g\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281389 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-multus-socket-dir-parent\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281463 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5217b4ac-ec08-4f15-af88-99f26535e549-tmp-dir\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281471 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ced465f-4a51-4441-b363-efac6c32deb0-host-slash\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281860 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69d58ea3-6f19-4956-a782-6313891c2513-etc-tuned\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.282070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.281943 
2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ced465f-4a51-4441-b363-efac6c32deb0-ovnkube-script-lib\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.282767 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.282101 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/006eba5f-c69e-415e-b993-2a2c72ae4df3-agent-certs\") pod \"konnectivity-agent-c9r6q\" (UID: \"006eba5f-c69e-415e-b993-2a2c72ae4df3\") " pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:16.285057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.284976 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ced465f-4a51-4441-b363-efac6c32deb0-ovn-node-metrics-cert\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.285681 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.285659 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx8x6\" (UniqueName: \"kubernetes.io/projected/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-kube-api-access-lx8x6\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:16.286706 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.286684 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69d58ea3-6f19-4956-a782-6313891c2513-tmp\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.287107 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.287085 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4tt7\" (UniqueName: \"kubernetes.io/projected/69d58ea3-6f19-4956-a782-6313891c2513-kube-api-access-r4tt7\") pod \"tuned-9k5fw\" (UID: \"69d58ea3-6f19-4956-a782-6313891c2513\") " pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.288808 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.288784 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvnkl\" (UniqueName: \"kubernetes.io/projected/0daae916-5659-44ea-96b4-ed96cbfa9da3-kube-api-access-cvnkl\") pod \"node-ca-g4cj8\" (UID: \"0daae916-5659-44ea-96b4-ed96cbfa9da3\") " pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.288911 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.288822 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwd9\" (UniqueName: \"kubernetes.io/projected/5217b4ac-ec08-4f15-af88-99f26535e549-kube-api-access-tqwd9\") pod \"node-resolver-4kjrn\" (UID: \"5217b4ac-ec08-4f15-af88-99f26535e549\") " pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.288911 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.288823 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2rh\" (UniqueName: \"kubernetes.io/projected/a723c52e-a9e7-4b73-852d-22d1fd084cfa-kube-api-access-lr2rh\") pod \"iptables-alerter-sfcs7\" (UID: \"a723c52e-a9e7-4b73-852d-22d1fd084cfa\") " pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.290953 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.290929 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc98d\" (UniqueName: \"kubernetes.io/projected/7e8df34b-a216-4c08-a88b-4c94b5d16b1c-kube-api-access-dc98d\") pod \"multus-nlr5m\" (UID: \"7e8df34b-a216-4c08-a88b-4c94b5d16b1c\") " pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.292997 ip-10-0-143-206 kubenswrapper[2539]: E0428 
19:16:16.292966 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:16.292997 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.292991 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:16.293153 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.293004 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.293153 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.293076 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.793054877 +0000 UTC m=+3.089410235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.295333 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.295303 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5qzz\" (UniqueName: \"kubernetes.io/projected/58213e63-9543-4438-bbbf-d242d52abc8f-kube-api-access-x5qzz\") pod \"multus-additional-cni-plugins-j682m\" (UID: \"58213e63-9543-4438-bbbf-d242d52abc8f\") " pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.296572 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.296553 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fr5g\" (UniqueName: \"kubernetes.io/projected/5ced465f-4a51-4441-b363-efac6c32deb0-kube-api-access-5fr5g\") pod \"ovnkube-node-977nw\" (UID: \"5ced465f-4a51-4441-b363-efac6c32deb0\") " pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.296774 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.296752 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6x98\" (UniqueName: \"kubernetes.io/projected/c64d6ced-de54-4be0-9661-bf00d68c4ce0-kube-api-access-g6x98\") pod \"aws-ebs-csi-driver-node-vkw4k\" (UID: \"c64d6ced-de54-4be0-9661-bf00d68c4ce0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.417445 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.417348 2539 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:16.462175 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.462141 2539 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4kjrn" Apr 28 19:16:16.471190 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.471160 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j682m" Apr 28 19:16:16.478894 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.478875 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nlr5m" Apr 28 19:16:16.483538 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.483515 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g4cj8" Apr 28 19:16:16.489057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.489036 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sfcs7" Apr 28 19:16:16.495684 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.495664 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:16:16.501230 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.501213 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:16.507440 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.507422 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" Apr 28 19:16:16.511989 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.511968 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" Apr 28 19:16:16.739432 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.739396 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod006eba5f_c69e_415e_b993_2a2c72ae4df3.slice/crio-5c9b16b4a555311cfffa438efb898583b563aa035547478a906bb5079393f49d WatchSource:0}: Error finding container 5c9b16b4a555311cfffa438efb898583b563aa035547478a906bb5079393f49d: Status 404 returned error can't find the container with id 5c9b16b4a555311cfffa438efb898583b563aa035547478a906bb5079393f49d Apr 28 19:16:16.740657 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.740597 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5217b4ac_ec08_4f15_af88_99f26535e549.slice/crio-72ab427911a56e63eb2ed08ee92cdedc1591415e6460022b7f4116c99356c29c WatchSource:0}: Error finding container 72ab427911a56e63eb2ed08ee92cdedc1591415e6460022b7f4116c99356c29c: Status 404 returned error can't find the container with id 72ab427911a56e63eb2ed08ee92cdedc1591415e6460022b7f4116c99356c29c Apr 28 19:16:16.741660 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.741638 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0daae916_5659_44ea_96b4_ed96cbfa9da3.slice/crio-d6112aff9274b54b71635e0ca8c4e18e83048f4063eb1b04933b98f982ac57b7 WatchSource:0}: Error finding container d6112aff9274b54b71635e0ca8c4e18e83048f4063eb1b04933b98f982ac57b7: Status 404 returned error can't find the container with id d6112aff9274b54b71635e0ca8c4e18e83048f4063eb1b04933b98f982ac57b7 Apr 28 19:16:16.742796 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.742692 2539 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58213e63_9543_4438_bbbf_d242d52abc8f.slice/crio-25da6f1ac889d49592f3bd13b2e3081ee7da0561e2acaa80281f932f7481390d WatchSource:0}: Error finding container 25da6f1ac889d49592f3bd13b2e3081ee7da0561e2acaa80281f932f7481390d: Status 404 returned error can't find the container with id 25da6f1ac889d49592f3bd13b2e3081ee7da0561e2acaa80281f932f7481390d Apr 28 19:16:16.743736 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.743714 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc64d6ced_de54_4be0_9661_bf00d68c4ce0.slice/crio-4a77318099d11734adf1e05c73ceb33de8c3f8d176626dd86c9c6946993ff255 WatchSource:0}: Error finding container 4a77318099d11734adf1e05c73ceb33de8c3f8d176626dd86c9c6946993ff255: Status 404 returned error can't find the container with id 4a77318099d11734adf1e05c73ceb33de8c3f8d176626dd86c9c6946993ff255 Apr 28 19:16:16.744302 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.744260 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8df34b_a216_4c08_a88b_4c94b5d16b1c.slice/crio-6f7bf959e8043ae8e3f7553466016e9d4233fbb7fef62f229aa83f7b25d47a7c WatchSource:0}: Error finding container 6f7bf959e8043ae8e3f7553466016e9d4233fbb7fef62f229aa83f7b25d47a7c: Status 404 returned error can't find the container with id 6f7bf959e8043ae8e3f7553466016e9d4233fbb7fef62f229aa83f7b25d47a7c Apr 28 19:16:16.745850 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.745804 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda723c52e_a9e7_4b73_852d_22d1fd084cfa.slice/crio-f93ac28549a177ca37b4d93ce663b5a837de1480d3a057f77e07541b48f627cb WatchSource:0}: Error finding container f93ac28549a177ca37b4d93ce663b5a837de1480d3a057f77e07541b48f627cb: Status 404 returned error can't find 
the container with id f93ac28549a177ca37b4d93ce663b5a837de1480d3a057f77e07541b48f627cb Apr 28 19:16:16.747480 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.747457 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d58ea3_6f19_4956_a782_6313891c2513.slice/crio-7f6257cc49986bc9ac74dd27fe92b6899b7a8ba9afe9f2577a533e9a0208f114 WatchSource:0}: Error finding container 7f6257cc49986bc9ac74dd27fe92b6899b7a8ba9afe9f2577a533e9a0208f114: Status 404 returned error can't find the container with id 7f6257cc49986bc9ac74dd27fe92b6899b7a8ba9afe9f2577a533e9a0208f114 Apr 28 19:16:16.751762 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:16.751671 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ced465f_4a51_4441_b363_efac6c32deb0.slice/crio-0665f0cc0143f863380c66cac201897978cc8168403e06db29e7ca235442ddfe WatchSource:0}: Error finding container 0665f0cc0143f863380c66cac201897978cc8168403e06db29e7ca235442ddfe: Status 404 returned error can't find the container with id 0665f0cc0143f863380c66cac201897978cc8168403e06db29e7ca235442ddfe Apr 28 19:16:16.785865 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.785721 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:16.785950 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.785866 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.785950 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.785923 2539 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.785907902 +0000 UTC m=+4.082263257 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.887263 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:16.887223 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:16.887459 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.887349 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:16.887459 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.887366 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:16.887459 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.887391 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.887459 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:16.887448 2539 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.887429835 +0000 UTC m=+4.183785178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:17.208906 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.208426 2539 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:15 +0000 UTC" deadline="2027-11-28 10:38:33.219333109 +0000 UTC" Apr 28 19:16:17.208906 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.208466 2539 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13887h22m16.010871492s" Apr 28 19:16:17.275893 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.275828 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerStarted","Data":"25da6f1ac889d49592f3bd13b2e3081ee7da0561e2acaa80281f932f7481390d"} Apr 28 19:16:17.293144 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.293049 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4kjrn" event={"ID":"5217b4ac-ec08-4f15-af88-99f26535e549","Type":"ContainerStarted","Data":"72ab427911a56e63eb2ed08ee92cdedc1591415e6460022b7f4116c99356c29c"} Apr 28 19:16:17.299490 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.299457 2539 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" event={"ID":"6d10770512520c6fc076a2363490adc7","Type":"ContainerStarted","Data":"472a72c676c5d66e8871fa35da99f957b2008e7c0fb51c6cd2f8da5d742db729"} Apr 28 19:16:17.303753 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.303672 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" event={"ID":"69d58ea3-6f19-4956-a782-6313891c2513","Type":"ContainerStarted","Data":"7f6257cc49986bc9ac74dd27fe92b6899b7a8ba9afe9f2577a533e9a0208f114"} Apr 28 19:16:17.307385 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.307340 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sfcs7" event={"ID":"a723c52e-a9e7-4b73-852d-22d1fd084cfa","Type":"ContainerStarted","Data":"f93ac28549a177ca37b4d93ce663b5a837de1480d3a057f77e07541b48f627cb"} Apr 28 19:16:17.316443 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.313055 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nlr5m" event={"ID":"7e8df34b-a216-4c08-a88b-4c94b5d16b1c","Type":"ContainerStarted","Data":"6f7bf959e8043ae8e3f7553466016e9d4233fbb7fef62f229aa83f7b25d47a7c"} Apr 28 19:16:17.329238 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.329163 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" event={"ID":"c64d6ced-de54-4be0-9661-bf00d68c4ce0","Type":"ContainerStarted","Data":"4a77318099d11734adf1e05c73ceb33de8c3f8d176626dd86c9c6946993ff255"} Apr 28 19:16:17.333874 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.333840 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4cj8" event={"ID":"0daae916-5659-44ea-96b4-ed96cbfa9da3","Type":"ContainerStarted","Data":"d6112aff9274b54b71635e0ca8c4e18e83048f4063eb1b04933b98f982ac57b7"} Apr 28 19:16:17.338534 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.338506 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c9r6q" event={"ID":"006eba5f-c69e-415e-b993-2a2c72ae4df3","Type":"ContainerStarted","Data":"5c9b16b4a555311cfffa438efb898583b563aa035547478a906bb5079393f49d"} Apr 28 19:16:17.347086 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.347056 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"0665f0cc0143f863380c66cac201897978cc8168403e06db29e7ca235442ddfe"} Apr 28 19:16:17.798157 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.798070 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:17.798325 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:17.798236 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:17.798325 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:17.798318 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.798294824 +0000 UTC m=+6.094650167 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:17.898795 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:17.898754 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:17.898979 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:17.898911 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:17.898979 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:17.898932 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:17.898979 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:17.898945 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:17.899134 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:17.899002 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:19.8989833 +0000 UTC m=+6.195338647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:18.059099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:18.058745 2539 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:18.270055 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:18.270017 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:18.270510 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:18.270142 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:18.272583 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:18.270654 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:18.272583 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:18.270758 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:18.367558 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:18.366627 2539 generic.go:358] "Generic (PLEG): container finished" podID="e5713c9e782ac5c4d7eb091b097c3dde" containerID="849111fedb92efcbe3fe6c5e71e0014a7e729c724e3a06ac2df3d33bd9bce4d2" exitCode=0 Apr 28 19:16:18.367708 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:18.367573 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" event={"ID":"e5713c9e782ac5c4d7eb091b097c3dde","Type":"ContainerDied","Data":"849111fedb92efcbe3fe6c5e71e0014a7e729c724e3a06ac2df3d33bd9bce4d2"} Apr 28 19:16:18.387063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:18.386112 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-206.ec2.internal" podStartSLOduration=3.3860926559999998 podStartE2EDuration="3.386092656s" podCreationTimestamp="2026-04-28 19:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:17.322587416 +0000 UTC m=+3.618942782" watchObservedRunningTime="2026-04-28 19:16:18.386092656 +0000 UTC m=+4.682448020" Apr 28 19:16:19.375681 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:19.375635 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" event={"ID":"e5713c9e782ac5c4d7eb091b097c3dde","Type":"ContainerStarted","Data":"23bab3e1923c93012829a01f5d239da6305e9a7502f18c798693970faa80a444"} Apr 28 19:16:19.392751 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:19.392696 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-206.ec2.internal" podStartSLOduration=4.392676562 podStartE2EDuration="4.392676562s" podCreationTimestamp="2026-04-28 19:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:19.391297753 +0000 UTC m=+5.687653118" watchObservedRunningTime="2026-04-28 19:16:19.392676562 +0000 UTC m=+5.689031927" Apr 28 19:16:19.811360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:19.810710 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:19.811360 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:19.810882 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:19.811360 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:19.810948 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.810927592 +0000 UTC m=+10.107282938 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:19.911618 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:19.911575 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:19.911800 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:19.911763 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:19.911800 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:19.911790 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:19.911910 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:19.911804 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:19.911910 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:19.911868 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:23.911848676 +0000 UTC m=+10.208204033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:20.267512 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:20.267030 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:20.267512 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:20.267157 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:20.267733 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:20.267570 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:20.267733 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:20.267668 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:22.267565 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:22.267529 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:22.268078 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:22.267679 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:22.268078 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:22.267761 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:22.268078 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:22.267875 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:23.842532 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:23.842497 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:23.842976 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:23.842680 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.842976 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:23.842754 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.842734674 +0000 UTC m=+18.139090017 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.943560 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:23.943509 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:23.943741 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:23.943715 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:23.943804 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:23.943741 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:23.943804 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:23.943757 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:23.943900 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:23.943827 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:31.943806765 +0000 UTC m=+18.240162126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:24.168155 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.167339 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-b4vxf"] Apr 28 19:16:24.170942 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.170470 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.170942 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.170553 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:24.246480 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.246354 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.246668 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.246521 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5325db29-356b-4407-92e1-5ad3950aa605-dbus\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.246668 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.246558 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5325db29-356b-4407-92e1-5ad3950aa605-kubelet-config\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.267581 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.267547 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:24.267742 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.267650 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:24.267742 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.267678 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:24.267858 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.267813 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.347659 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.347725 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5325db29-356b-4407-92e1-5ad3950aa605-dbus\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.347753 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5325db29-356b-4407-92e1-5ad3950aa605-kubelet-config\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " 
pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.347867 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5325db29-356b-4407-92e1-5ad3950aa605-kubelet-config\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.348004 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.348066 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:24.848046397 +0000 UTC m=+11.144401761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:24.348510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.348457 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5325db29-356b-4407-92e1-5ad3950aa605-dbus\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.851180 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:24.851122 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:24.851707 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.851310 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:24.851707 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:24.851408 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:25.851385562 +0000 UTC m=+12.147740924 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:25.858013 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:25.857971 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:25.858472 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:25.858138 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:25.858472 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:25.858220 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:27.858196925 +0000 UTC m=+14.154552268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:26.266583 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:26.266547 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:26.266747 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:26.266547 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:26.266747 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:26.266672 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:26.266849 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:26.266547 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:26.266849 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:26.266749 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:26.266957 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:26.266843 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:27.870610 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:27.870571 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:27.871032 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:27.870692 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:27.871032 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:27.870758 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.8707405 +0000 UTC m=+18.167095844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:28.266884 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:28.266801 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:28.266884 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:28.266825 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:28.266884 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:28.266808 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:28.267122 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:28.266991 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:28.267122 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:28.267031 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:28.267213 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:28.267129 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:30.266781 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:30.266743 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:30.267218 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:30.266743 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:30.267218 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:30.266873 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:30.267218 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:30.266743 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:30.267218 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:30.266956 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:30.267218 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:30.267040 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:31.898694 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:31.898655 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:31.898694 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:31.898695 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:31.899111 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.898803 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:31.899111 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.898862 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.898848399 +0000 UTC m=+34.195203754 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:31.899111 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.898803 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:31.899111 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.898934 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:39.898922127 +0000 UTC m=+26.195277472 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:31.999674 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:31.999637 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:31.999832 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.999769 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:31.999832 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.999791 2539 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:31.999832 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.999803 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:31.999968 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:31.999857 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:47.999843871 +0000 UTC m=+34.296199213 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:32.269559 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:32.269483 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:32.269559 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:32.269521 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:32.269559 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:32.269562 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:32.269769 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:32.269664 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:32.269818 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:32.269783 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:32.269881 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:32.269862 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:34.267884 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.267535 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:34.268620 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.267627 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:34.268620 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:34.267975 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:34.268620 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.267667 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:34.268620 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:34.268043 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:34.268620 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:34.268134 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:34.401529 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.401481 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c9r6q" event={"ID":"006eba5f-c69e-415e-b993-2a2c72ae4df3","Type":"ContainerStarted","Data":"4d6ec8fc6f21f0be54dc460ff5d650a4cc915d0c0c15555806181d577e064fdc"} Apr 28 19:16:34.404171 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404147 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:16:34.404514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404490 2539 generic.go:358] "Generic (PLEG): container finished" podID="5ced465f-4a51-4441-b363-efac6c32deb0" containerID="b418f95ef791b22f9439745c5e7dd6dd6eea85b4931f957058cbbad866937480" exitCode=1 Apr 28 19:16:34.404633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404565 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"9969985768f02330e10feb364c1e31eef7dfb211be90e1565f5db8b0bdd58ecc"} Apr 28 19:16:34.404633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404590 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"6fb2a40b32043610ede6f148804a1ee2a88102e7c6432b13d916822d26a7f0bc"} Apr 28 19:16:34.404633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404599 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"f5ed573743876f6fc7185899f4f47d6c0e3375faa2d38c82b6b7e5b455b196a0"} Apr 28 19:16:34.404633 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404607 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"f3b76a495d19727edc7be7e8d2e788f220d063042fd8c5a1de00f7c0ddd78567"} Apr 28 19:16:34.404633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404615 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerDied","Data":"b418f95ef791b22f9439745c5e7dd6dd6eea85b4931f957058cbbad866937480"} Apr 28 19:16:34.404633 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.404630 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"56b906e4327cfb054dabd5374e404e5c761d494ef128cb4cddf6e70f753c75b9"} Apr 28 19:16:34.406021 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.406000 2539 generic.go:358] "Generic (PLEG): container finished" podID="58213e63-9543-4438-bbbf-d242d52abc8f" containerID="c461ca595f820b13abc8733b3e26725e51a65357868f233b0cfe040caa2b3822" exitCode=0 Apr 28 19:16:34.406107 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.406060 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerDied","Data":"c461ca595f820b13abc8733b3e26725e51a65357868f233b0cfe040caa2b3822"} Apr 28 19:16:34.407328 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.407305 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4kjrn" event={"ID":"5217b4ac-ec08-4f15-af88-99f26535e549","Type":"ContainerStarted","Data":"4b124a82fa6da9c7b58c621a8474487774293f16ba4f28ebf4bb12373833acab"} Apr 28 19:16:34.408462 ip-10-0-143-206 kubenswrapper[2539]: 
I0428 19:16:34.408442 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" event={"ID":"69d58ea3-6f19-4956-a782-6313891c2513","Type":"ContainerStarted","Data":"c0f33740ce586d1f4d75073560107031bd1ddd9acd5d551909747a32ae6301c9"} Apr 28 19:16:34.409752 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.409730 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nlr5m" event={"ID":"7e8df34b-a216-4c08-a88b-4c94b5d16b1c","Type":"ContainerStarted","Data":"5900eb5b4bdb5d68bfd71e59fb62ac1a7ebdbb5e5392420e7e6bc58330123d64"} Apr 28 19:16:34.410820 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.410799 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" event={"ID":"c64d6ced-de54-4be0-9661-bf00d68c4ce0","Type":"ContainerStarted","Data":"61a3ad47a767ddd94489d0c8622feeaecf847afd8a3d7dcacabc2025a4609e70"} Apr 28 19:16:34.411987 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.411966 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4cj8" event={"ID":"0daae916-5659-44ea-96b4-ed96cbfa9da3","Type":"ContainerStarted","Data":"9f4e146ca73073c09f776ffcc33ec74760d4e85de7f0e8511e87a05ef23f8b0a"} Apr 28 19:16:34.420304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.420265 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c9r6q" podStartSLOduration=3.748762925 podStartE2EDuration="20.420253529s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.741075425 +0000 UTC m=+3.037430766" lastFinishedPulling="2026-04-28 19:16:33.412566013 +0000 UTC m=+19.708921370" observedRunningTime="2026-04-28 19:16:34.419880199 +0000 UTC m=+20.716235563" watchObservedRunningTime="2026-04-28 19:16:34.420253529 +0000 UTC m=+20.716608894" Apr 28 19:16:34.437729 ip-10-0-143-206 kubenswrapper[2539]: 
I0428 19:16:34.437648 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9k5fw" podStartSLOduration=3.77677072 podStartE2EDuration="20.43763462s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.751838044 +0000 UTC m=+3.048193387" lastFinishedPulling="2026-04-28 19:16:33.412701936 +0000 UTC m=+19.709057287" observedRunningTime="2026-04-28 19:16:34.437344856 +0000 UTC m=+20.733700221" watchObservedRunningTime="2026-04-28 19:16:34.43763462 +0000 UTC m=+20.733989985" Apr 28 19:16:34.452417 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.452359 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4kjrn" podStartSLOduration=3.781980257 podStartE2EDuration="20.452346203s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.742252622 +0000 UTC m=+3.038607965" lastFinishedPulling="2026-04-28 19:16:33.412618556 +0000 UTC m=+19.708973911" observedRunningTime="2026-04-28 19:16:34.452295664 +0000 UTC m=+20.748651040" watchObservedRunningTime="2026-04-28 19:16:34.452346203 +0000 UTC m=+20.748701567" Apr 28 19:16:34.512497 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.512441 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g4cj8" podStartSLOduration=3.843074483 podStartE2EDuration="20.512423217s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.743612608 +0000 UTC m=+3.039967951" lastFinishedPulling="2026-04-28 19:16:33.412961329 +0000 UTC m=+19.709316685" observedRunningTime="2026-04-28 19:16:34.493212718 +0000 UTC m=+20.789568082" watchObservedRunningTime="2026-04-28 19:16:34.512423217 +0000 UTC m=+20.808778581" Apr 28 19:16:34.513075 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:34.513035 2539 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-nlr5m" podStartSLOduration=3.809198898 podStartE2EDuration="20.513027818s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.746827563 +0000 UTC m=+3.043182905" lastFinishedPulling="2026-04-28 19:16:33.450656473 +0000 UTC m=+19.747011825" observedRunningTime="2026-04-28 19:16:34.51257489 +0000 UTC m=+20.808930255" watchObservedRunningTime="2026-04-28 19:16:34.513027818 +0000 UTC m=+20.809383194" Apr 28 19:16:35.271638 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:35.271593 2539 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:35.414631 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:35.414553 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sfcs7" event={"ID":"a723c52e-a9e7-4b73-852d-22d1fd084cfa","Type":"ContainerStarted","Data":"e3e4638e787ecbfabeb50ec02fb728754d82af3ba40d70093a114dcbad565e50"} Apr 28 19:16:35.416139 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:35.416110 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" event={"ID":"c64d6ced-de54-4be0-9661-bf00d68c4ce0","Type":"ContainerStarted","Data":"dc652365f3c4ab1c1f327f733ce1a2ed5e0c9205cc3e7e5d03f7e2b1578d4a48"} Apr 28 19:16:35.430572 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:35.430520 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sfcs7" podStartSLOduration=4.766823729 podStartE2EDuration="21.430504302s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.749336199 +0000 UTC m=+3.045691553" lastFinishedPulling="2026-04-28 19:16:33.413016774 +0000 UTC m=+19.709372126" observedRunningTime="2026-04-28 19:16:35.429823983 +0000 UTC m=+21.726179374" 
watchObservedRunningTime="2026-04-28 19:16:35.430504302 +0000 UTC m=+21.726859666" Apr 28 19:16:36.243446 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:36.243312 2539 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:35.271630305Z","UUID":"007f2d69-fe83-4874-8261-49fbbdc4bb6a","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:36.246695 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:36.246669 2539 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:36.246695 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:36.246701 2539 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:36.267227 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:36.267196 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:36.267401 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:36.267235 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:36.267401 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:36.267205 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:36.267401 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:36.267321 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695" Apr 28 19:16:36.267562 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:36.267442 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5" Apr 28 19:16:36.267562 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:36.267520 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605" Apr 28 19:16:37.167384 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.167087 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:37.167863 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.167843 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:37.232798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.232765 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:37.233451 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.233433 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c9r6q" Apr 28 19:16:37.422107 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.422028 2539 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" event={"ID":"c64d6ced-de54-4be0-9661-bf00d68c4ce0","Type":"ContainerStarted","Data":"e3f86822341cfd815341583de6fd118112360a78b1ea835b9528831fd89b9b51"} Apr 28 19:16:37.424996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.424968 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:16:37.425365 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.425335 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"e74aaa3f14d164277f976cb8cac53d3576187e659e364b8e6fbe45186e31313a"} Apr 28 19:16:37.442781 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:37.442738 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vkw4k" podStartSLOduration=3.762178409 podStartE2EDuration="23.442724522s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.746431878 +0000 UTC m=+3.042787220" lastFinishedPulling="2026-04-28 19:16:36.426977991 +0000 UTC m=+22.723333333" observedRunningTime="2026-04-28 19:16:37.441876663 +0000 UTC m=+23.738232038" watchObservedRunningTime="2026-04-28 19:16:37.442724522 +0000 UTC m=+23.739079886" Apr 28 19:16:38.267614 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:38.267585 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:38.268099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:38.267585 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:38.268099 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:38.267713 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:38.268099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:38.267585 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:38.268099 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:38.267826 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:38.268099 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:38.267866 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:39.432893 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.432705 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 19:16:39.433700 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.433210 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"5f8de976540b2c79072e72bdd233f746070f0a97a31700d25321c5bdc86933d7"}
Apr 28 19:16:39.433700 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.433593 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:39.433817 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.433801 2539 scope.go:117] "RemoveContainer" containerID="b418f95ef791b22f9439745c5e7dd6dd6eea85b4931f957058cbbad866937480"
Apr 28 19:16:39.435087 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.435066 2539 generic.go:358] "Generic (PLEG): container finished" podID="58213e63-9543-4438-bbbf-d242d52abc8f" containerID="92accd5fcfed82d839bc2232c9cee41b9cb8d4d1807615595a4754d04afd55c1" exitCode=0
Apr 28 19:16:39.435218 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.435099 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerDied","Data":"92accd5fcfed82d839bc2232c9cee41b9cb8d4d1807615595a4754d04afd55c1"}
Apr 28 19:16:39.449635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.449613 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:39.959899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:39.959861 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:39.960058 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:39.960003 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:39.960102 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:39.960068 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:55.960053098 +0000 UTC m=+42.256408440 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:40.267353 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.267308 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:40.267353 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.267335 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:40.267353 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.267352 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:40.267641 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:40.267468 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:40.267641 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:40.267582 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:40.267737 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:40.267673 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:40.440023 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.439998 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 19:16:40.440400 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.440281 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" event={"ID":"5ced465f-4a51-4441-b363-efac6c32deb0","Type":"ContainerStarted","Data":"b24677ae452f14998a5378da6453566de6c01bd945547187fb6e5f0eb6a0095a"}
Apr 28 19:16:40.440521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.440508 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:40.440576 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.440529 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:40.454081 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.454054 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-977nw"
Apr 28 19:16:40.469997 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:40.469948 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" podStartSLOduration=9.585806147 podStartE2EDuration="26.469935552s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.753562196 +0000 UTC m=+3.049917538" lastFinishedPulling="2026-04-28 19:16:33.637691585 +0000 UTC m=+19.934046943" observedRunningTime="2026-04-28 19:16:40.469350631 +0000 UTC m=+26.765706018" watchObservedRunningTime="2026-04-28 19:16:40.469935552 +0000 UTC m=+26.766290915"
Apr 28 19:16:41.443910 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:41.443874 2539 generic.go:358] "Generic (PLEG): container finished" podID="58213e63-9543-4438-bbbf-d242d52abc8f" containerID="4ab6e48bc8266b93899dda4e6e56767dcc0695d9b5ccc1dd545e2380b5408968" exitCode=0
Apr 28 19:16:41.444325 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:41.443958 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerDied","Data":"4ab6e48bc8266b93899dda4e6e56767dcc0695d9b5ccc1dd545e2380b5408968"}
Apr 28 19:16:42.267244 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:42.267213 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:42.267433 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:42.267316 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:42.267433 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:42.267325 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:42.267433 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:42.267346 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:42.267564 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:42.267452 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:42.267564 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:42.267516 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:43.448938 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:43.448901 2539 generic.go:358] "Generic (PLEG): container finished" podID="58213e63-9543-4438-bbbf-d242d52abc8f" containerID="7d24527f1fda0bee6a8e976bab29dac1fff2f9df16c784b33ef0621c6fa5332d" exitCode=0
Apr 28 19:16:43.449296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:43.448949 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerDied","Data":"7d24527f1fda0bee6a8e976bab29dac1fff2f9df16c784b33ef0621c6fa5332d"}
Apr 28 19:16:44.267679 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:44.267646 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:44.267907 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:44.267740 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:44.267907 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:44.267770 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:44.268053 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:44.267898 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:44.268053 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:44.267934 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:44.268053 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:44.268015 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:46.267072 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:46.267033 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:46.267646 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:46.267044 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:46.267646 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:46.267163 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:46.267646 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:46.267045 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:46.267646 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:46.267259 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:46.267646 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:46.267305 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:47.923926 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:47.923888 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:47.924553 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:47.924052 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:47.924553 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:47.924129 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs podName:2c344b2c-cf71-45b1-9143-e86be8d1b7b5 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:19.924107479 +0000 UTC m=+66.220462835 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs") pod "network-metrics-daemon-txdd9" (UID: "2c344b2c-cf71-45b1-9143-e86be8d1b7b5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:48.024347 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:48.024307 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:48.024544 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.024455 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:48.024544 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.024480 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:48.024544 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.024493 2539 projected.go:194] Error preparing data for projected volume kube-api-access-9ptnz for pod openshift-network-diagnostics/network-check-target-wg74q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:48.024544 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.024545 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz podName:dff9f9ea-63cc-4089-bb7e-e9fcb292c695 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:20.024532126 +0000 UTC m=+66.320887472 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9ptnz" (UniqueName: "kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz") pod "network-check-target-wg74q" (UID: "dff9f9ea-63cc-4089-bb7e-e9fcb292c695") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:48.267265 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:48.267225 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:48.267475 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:48.267273 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:48.267475 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:48.267327 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:48.267475 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.267438 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:48.267635 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.267576 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:48.267692 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:48.267674 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:50.266708 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:50.266675 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:50.267115 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:50.266675 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:50.267115 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:50.266782 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:50.267115 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:50.266675 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:50.267115 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:50.266870 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:50.267115 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:50.266947 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:50.467124 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:50.467093 2539 generic.go:358] "Generic (PLEG): container finished" podID="58213e63-9543-4438-bbbf-d242d52abc8f" containerID="fa5f11e6577ee305baa876dccad4153b6c8ee8112f3a8c5697edf4b9af3524a1" exitCode=0
Apr 28 19:16:50.467274 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:50.467138 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerDied","Data":"fa5f11e6577ee305baa876dccad4153b6c8ee8112f3a8c5697edf4b9af3524a1"}
Apr 28 19:16:51.471897 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:51.471862 2539 generic.go:358] "Generic (PLEG): container finished" podID="58213e63-9543-4438-bbbf-d242d52abc8f" containerID="b00d13e2e1b0b51a3dba702903c93765a07019f7b90a43c434ffd343339a566a" exitCode=0
Apr 28 19:16:51.472294 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:51.471945 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerDied","Data":"b00d13e2e1b0b51a3dba702903c93765a07019f7b90a43c434ffd343339a566a"}
Apr 28 19:16:52.267350 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:52.267310 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:52.267350 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:52.267310 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:52.267576 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:52.267458 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:52.267576 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:52.267476 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:52.267576 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:52.267534 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:52.267695 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:52.267606 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:52.476900 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:52.476861 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j682m" event={"ID":"58213e63-9543-4438-bbbf-d242d52abc8f","Type":"ContainerStarted","Data":"cefaccb16ed5c3656f3f5453f2a2470f43592b0f1108825ceb0aa926144c42ab"}
Apr 28 19:16:52.536946 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:52.536837 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j682m" podStartSLOduration=5.984488124 podStartE2EDuration="38.536824454s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:16:16.745991703 +0000 UTC m=+3.042347046" lastFinishedPulling="2026-04-28 19:16:49.298328021 +0000 UTC m=+35.594683376" observedRunningTime="2026-04-28 19:16:52.536448066 +0000 UTC m=+38.832803429" watchObservedRunningTime="2026-04-28 19:16:52.536824454 +0000 UTC m=+38.833179817"
Apr 28 19:16:53.745514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:53.745322 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b4vxf"]
Apr 28 19:16:53.745931 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:53.745618 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:53.745931 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:53.745703 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:53.750684 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:53.750648 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-txdd9"]
Apr 28 19:16:53.751046 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:53.751026 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:53.751310 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:53.751286 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:53.751572 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:53.751545 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wg74q"]
Apr 28 19:16:53.751823 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:53.751803 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:53.751950 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:53.751916 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:55.267090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:55.267043 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:55.267090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:55.267079 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:55.267090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:55.267108 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:55.267715 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:55.267181 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:55.267715 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:55.267222 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:55.267715 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:55.267302 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:55.986084 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:55.985987 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:55.986234 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:55.986126 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:55.986234 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:55.986185 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret podName:5325db29-356b-4407-92e1-5ad3950aa605 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:27.986172029 +0000 UTC m=+74.282527375 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret") pod "global-pull-secret-syncer-b4vxf" (UID: "5325db29-356b-4407-92e1-5ad3950aa605") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:57.267300 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.267273 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf"
Apr 28 19:16:57.267701 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.267273 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9"
Apr 28 19:16:57.267701 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:57.267370 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-b4vxf" podUID="5325db29-356b-4407-92e1-5ad3950aa605"
Apr 28 19:16:57.267701 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.267273 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q"
Apr 28 19:16:57.267701 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:57.267469 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-txdd9" podUID="2c344b2c-cf71-45b1-9143-e86be8d1b7b5"
Apr 28 19:16:57.267701 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:16:57.267525 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wg74q" podUID="dff9f9ea-63cc-4089-bb7e-e9fcb292c695"
Apr 28 19:16:57.563182 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.563112 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-206.ec2.internal" event="NodeReady"
Apr 28 19:16:57.563317 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.563230 2539 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 28 19:16:57.608875 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.608838 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85f54f8846-b82cj"]
Apr 28 19:16:57.612211 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.612196 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85f54f8846-b82cj"
Apr 28 19:16:57.617675 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.617650 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b9m64\""
Apr 28 19:16:57.617804 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.617663 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 28 19:16:57.617804 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.617698 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 28 19:16:57.620168 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.620151 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 28 19:16:57.622813 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.622798 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 28 19:16:57.630650 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.630630 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85f54f8846-b82cj"]
Apr 28 19:16:57.637575 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.637555 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d2nhx"]
Apr 28 19:16:57.641186 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.641173 2539 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.645930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.645910 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 28 19:16:57.646194 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.646181 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 28 19:16:57.646234 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.646207 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xdh6h\"" Apr 28 19:16:57.658251 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.658228 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d2nhx"] Apr 28 19:16:57.674727 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.674629 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-78qnv"] Apr 28 19:16:57.680449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.680427 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.683335 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.683312 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 28 19:16:57.683335 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.683327 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tpf57\"" Apr 28 19:16:57.683525 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.683344 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 28 19:16:57.683639 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.683626 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 28 19:16:57.684739 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.684725 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 28 19:16:57.691645 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.691627 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-78qnv"] Apr 28 19:16:57.747747 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.747711 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hl4tm"] Apr 28 19:16:57.751432 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.751418 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:57.754511 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.754490 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 28 19:16:57.754638 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.754522 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 28 19:16:57.754638 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.754539 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qbgzj\"" Apr 28 19:16:57.754905 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.754892 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 28 19:16:57.762293 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.762273 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hl4tm"] Apr 28 19:16:57.797685 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797651 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-bound-sa-token\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.797685 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797684 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlx4h\" (UniqueName: \"kubernetes.io/projected/f8173256-d810-4483-b373-4b19f554cbf6-kube-api-access-vlx4h\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " 
pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.797878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797704 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5c7a7b2d-077e-4f5f-96e5-571525a4f600-crio-socket\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.797878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797725 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwhm5\" (UniqueName: \"kubernetes.io/projected/5c7a7b2d-077e-4f5f-96e5-571525a4f600-kube-api-access-rwhm5\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.797878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797808 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8173256-d810-4483-b373-4b19f554cbf6-metrics-tls\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.797878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797828 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c7a7b2d-077e-4f5f-96e5-571525a4f600-data-volume\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.797878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797852 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/5c7a7b2d-077e-4f5f-96e5-571525a4f600-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797898 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/44e01525-a6e2-451a-8b52-51306d0ab16f-ca-trust-extracted\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797925 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/44e01525-a6e2-451a-8b52-51306d0ab16f-registry-certificates\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797956 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbl54\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-kube-api-access-mbl54\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797974 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8173256-d810-4483-b373-4b19f554cbf6-config-volume\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " 
pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.797997 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5c7a7b2d-077e-4f5f-96e5-571525a4f600-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.798038 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44e01525-a6e2-451a-8b52-51306d0ab16f-trusted-ca\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.798056 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/44e01525-a6e2-451a-8b52-51306d0ab16f-image-registry-private-configuration\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.798072 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-registry-tls\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.798097 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/44e01525-a6e2-451a-8b52-51306d0ab16f-installation-pull-secrets\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.798284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.798116 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8173256-d810-4483-b373-4b19f554cbf6-tmp-dir\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.898786 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898694 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-bound-sa-token\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.898786 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898732 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlx4h\" (UniqueName: \"kubernetes.io/projected/f8173256-d810-4483-b373-4b19f554cbf6-kube-api-access-vlx4h\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.898786 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898759 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whml7\" (UniqueName: \"kubernetes.io/projected/f2e61248-2e6c-4f91-806d-ba6a148c3b71-kube-api-access-whml7\") pod \"ingress-canary-hl4tm\" (UID: \"f2e61248-2e6c-4f91-806d-ba6a148c3b71\") " 
pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898789 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5c7a7b2d-077e-4f5f-96e5-571525a4f600-crio-socket\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898813 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwhm5\" (UniqueName: \"kubernetes.io/projected/5c7a7b2d-077e-4f5f-96e5-571525a4f600-kube-api-access-rwhm5\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898850 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8173256-d810-4483-b373-4b19f554cbf6-metrics-tls\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898875 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c7a7b2d-077e-4f5f-96e5-571525a4f600-data-volume\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898898 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5c7a7b2d-077e-4f5f-96e5-571525a4f600-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898926 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/44e01525-a6e2-451a-8b52-51306d0ab16f-ca-trust-extracted\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898948 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/44e01525-a6e2-451a-8b52-51306d0ab16f-registry-certificates\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.898984 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbl54\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-kube-api-access-mbl54\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899010 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8173256-d810-4483-b373-4b19f554cbf6-config-volume\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899012 2539 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5c7a7b2d-077e-4f5f-96e5-571525a4f600-crio-socket\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.899064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899035 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5c7a7b2d-077e-4f5f-96e5-571525a4f600-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899093 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44e01525-a6e2-451a-8b52-51306d0ab16f-trusted-ca\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899259 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/44e01525-a6e2-451a-8b52-51306d0ab16f-image-registry-private-configuration\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899289 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-registry-tls\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " 
pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899343 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/44e01525-a6e2-451a-8b52-51306d0ab16f-installation-pull-secrets\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899392 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8173256-d810-4483-b373-4b19f554cbf6-tmp-dir\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899419 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2e61248-2e6c-4f91-806d-ba6a148c3b71-cert\") pod \"ingress-canary-hl4tm\" (UID: \"f2e61248-2e6c-4f91-806d-ba6a148c3b71\") " pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:57.899635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899437 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/44e01525-a6e2-451a-8b52-51306d0ab16f-ca-trust-extracted\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.899960 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899926 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8173256-d810-4483-b373-4b19f554cbf6-config-volume\") pod 
\"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.900023 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.899993 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/44e01525-a6e2-451a-8b52-51306d0ab16f-registry-certificates\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.900101 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.900080 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44e01525-a6e2-451a-8b52-51306d0ab16f-trusted-ca\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.900159 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.900108 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8173256-d810-4483-b373-4b19f554cbf6-tmp-dir\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.900906 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.900845 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c7a7b2d-077e-4f5f-96e5-571525a4f600-data-volume\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.901194 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.901159 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/5c7a7b2d-077e-4f5f-96e5-571525a4f600-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.903337 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.903308 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8173256-d810-4483-b373-4b19f554cbf6-metrics-tls\") pod \"dns-default-d2nhx\" (UID: \"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.903452 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.903337 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5c7a7b2d-077e-4f5f-96e5-571525a4f600-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.903452 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.903391 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/44e01525-a6e2-451a-8b52-51306d0ab16f-image-registry-private-configuration\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.903452 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.903405 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/44e01525-a6e2-451a-8b52-51306d0ab16f-installation-pull-secrets\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.903452 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:16:57.903442 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-registry-tls\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.906722 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.906698 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-bound-sa-token\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.907676 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.907653 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbl54\" (UniqueName: \"kubernetes.io/projected/44e01525-a6e2-451a-8b52-51306d0ab16f-kube-api-access-mbl54\") pod \"image-registry-85f54f8846-b82cj\" (UID: \"44e01525-a6e2-451a-8b52-51306d0ab16f\") " pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.908218 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.908191 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwhm5\" (UniqueName: \"kubernetes.io/projected/5c7a7b2d-077e-4f5f-96e5-571525a4f600-kube-api-access-rwhm5\") pod \"insights-runtime-extractor-78qnv\" (UID: \"5c7a7b2d-077e-4f5f-96e5-571525a4f600\") " pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:57.908293 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.908262 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlx4h\" (UniqueName: \"kubernetes.io/projected/f8173256-d810-4483-b373-4b19f554cbf6-kube-api-access-vlx4h\") pod \"dns-default-d2nhx\" (UID: 
\"f8173256-d810-4483-b373-4b19f554cbf6\") " pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.922179 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.922147 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:57.949149 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.948990 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d2nhx" Apr 28 19:16:57.989013 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:57.988985 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-78qnv" Apr 28 19:16:58.000403 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.000263 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2e61248-2e6c-4f91-806d-ba6a148c3b71-cert\") pod \"ingress-canary-hl4tm\" (UID: \"f2e61248-2e6c-4f91-806d-ba6a148c3b71\") " pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:58.000403 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.000315 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whml7\" (UniqueName: \"kubernetes.io/projected/f2e61248-2e6c-4f91-806d-ba6a148c3b71-kube-api-access-whml7\") pod \"ingress-canary-hl4tm\" (UID: \"f2e61248-2e6c-4f91-806d-ba6a148c3b71\") " pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:58.006734 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.006699 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f2e61248-2e6c-4f91-806d-ba6a148c3b71-cert\") pod \"ingress-canary-hl4tm\" (UID: \"f2e61248-2e6c-4f91-806d-ba6a148c3b71\") " pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:58.009706 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.009649 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whml7\" (UniqueName: \"kubernetes.io/projected/f2e61248-2e6c-4f91-806d-ba6a148c3b71-kube-api-access-whml7\") pod \"ingress-canary-hl4tm\" (UID: \"f2e61248-2e6c-4f91-806d-ba6a148c3b71\") " pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:58.059973 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.059913 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hl4tm" Apr 28 19:16:58.068018 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.067959 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85f54f8846-b82cj"] Apr 28 19:16:58.072903 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:58.072866 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e01525_a6e2_451a_8b52_51306d0ab16f.slice/crio-6e3867841035cf8e00512c747faec5ed3d671e454b0da5a474aa73c8cf0226fe WatchSource:0}: Error finding container 6e3867841035cf8e00512c747faec5ed3d671e454b0da5a474aa73c8cf0226fe: Status 404 returned error can't find the container with id 6e3867841035cf8e00512c747faec5ed3d671e454b0da5a474aa73c8cf0226fe Apr 28 19:16:58.083605 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.083583 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d2nhx"] Apr 28 19:16:58.089641 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:58.089593 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8173256_d810_4483_b373_4b19f554cbf6.slice/crio-32176c435d821bbcf757d742b9688a26aceb1a3ff82f06da305bcad2140dbf79 WatchSource:0}: Error finding container 32176c435d821bbcf757d742b9688a26aceb1a3ff82f06da305bcad2140dbf79: Status 404 returned error can't find the container with id 
32176c435d821bbcf757d742b9688a26aceb1a3ff82f06da305bcad2140dbf79 Apr 28 19:16:58.139569 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.138416 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-78qnv"] Apr 28 19:16:58.149352 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:58.149253 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7a7b2d_077e_4f5f_96e5_571525a4f600.slice/crio-cdeddc77bef3e76e16f5fde891d5a8afdde19f8fc2d728e861734917df805830 WatchSource:0}: Error finding container cdeddc77bef3e76e16f5fde891d5a8afdde19f8fc2d728e861734917df805830: Status 404 returned error can't find the container with id cdeddc77bef3e76e16f5fde891d5a8afdde19f8fc2d728e861734917df805830 Apr 28 19:16:58.209012 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.208984 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hl4tm"] Apr 28 19:16:58.213709 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:16:58.213674 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e61248_2e6c_4f91_806d_ba6a148c3b71.slice/crio-cdfd9c65301550ed2b62b0e0b69ce53b8fbfedac43e60241275c336ebe6fc18b WatchSource:0}: Error finding container cdfd9c65301550ed2b62b0e0b69ce53b8fbfedac43e60241275c336ebe6fc18b: Status 404 returned error can't find the container with id cdfd9c65301550ed2b62b0e0b69ce53b8fbfedac43e60241275c336ebe6fc18b Apr 28 19:16:58.488512 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.488435 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hl4tm" event={"ID":"f2e61248-2e6c-4f91-806d-ba6a148c3b71","Type":"ContainerStarted","Data":"cdfd9c65301550ed2b62b0e0b69ce53b8fbfedac43e60241275c336ebe6fc18b"} Apr 28 19:16:58.489443 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.489409 2539 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2nhx" event={"ID":"f8173256-d810-4483-b373-4b19f554cbf6","Type":"ContainerStarted","Data":"32176c435d821bbcf757d742b9688a26aceb1a3ff82f06da305bcad2140dbf79"} Apr 28 19:16:58.490730 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.490705 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78qnv" event={"ID":"5c7a7b2d-077e-4f5f-96e5-571525a4f600","Type":"ContainerStarted","Data":"0d1785c4a2dfc9b2f98185032805fd8f337cbd0b050e3038289dba93b2f747e6"} Apr 28 19:16:58.490851 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.490737 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78qnv" event={"ID":"5c7a7b2d-077e-4f5f-96e5-571525a4f600","Type":"ContainerStarted","Data":"cdeddc77bef3e76e16f5fde891d5a8afdde19f8fc2d728e861734917df805830"} Apr 28 19:16:58.492019 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.491986 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85f54f8846-b82cj" event={"ID":"44e01525-a6e2-451a-8b52-51306d0ab16f","Type":"ContainerStarted","Data":"a65e9046e8428ad94649576482aa083d7dbbfe24c285280eb4eaf3945e90c9a6"} Apr 28 19:16:58.492122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.492035 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85f54f8846-b82cj" event={"ID":"44e01525-a6e2-451a-8b52-51306d0ab16f","Type":"ContainerStarted","Data":"6e3867841035cf8e00512c747faec5ed3d671e454b0da5a474aa73c8cf0226fe"} Apr 28 19:16:58.492185 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.492121 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:16:58.511831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:58.511643 2539 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/image-registry-85f54f8846-b82cj" podStartSLOduration=5.51162821 podStartE2EDuration="5.51162821s" podCreationTimestamp="2026-04-28 19:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:58.510717626 +0000 UTC m=+44.807072991" watchObservedRunningTime="2026-04-28 19:16:58.51162821 +0000 UTC m=+44.807983594" Apr 28 19:16:59.266780 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.266742 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:16:59.266780 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.266773 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:16:59.267083 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.266832 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:16:59.270802 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.270750 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:16:59.270954 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.270833 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:16:59.270954 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.270854 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:16:59.270954 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.270904 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n7p66\"" Apr 28 19:16:59.270954 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.270833 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6mdm8\"" Apr 28 19:16:59.271161 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.271066 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:16:59.497252 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:16:59.497205 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78qnv" event={"ID":"5c7a7b2d-077e-4f5f-96e5-571525a4f600","Type":"ContainerStarted","Data":"28e9c1d0c603ed0bc0a755ca86bd33c090f8ad11ea0122796b5b0b64b6f83c97"} Apr 28 19:17:00.753978 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.753944 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dxng4"] Apr 28 19:17:00.755872 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.755856 2539 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.762014 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.761990 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 28 19:17:00.762629 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.762615 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 28 19:17:00.762759 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.762745 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 28 19:17:00.763092 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.762814 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 28 19:17:00.766723 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.766705 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 28 19:17:00.767284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.767267 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 28 19:17:00.767366 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.767269 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x74v7\"" Apr 28 19:17:00.823648 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823542 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-tls\") pod \"node-exporter-dxng4\" (UID: 
\"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.823648 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823626 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-accelerators-collector-config\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.823831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823671 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-wtmp\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.823831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823744 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-root\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.823831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823772 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-sys\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.823831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823817 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-textfile\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.824020 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823854 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.824020 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823921 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6af9d93a-8042-4a3f-a6d6-b7603c690151-metrics-client-ca\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.824020 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.823964 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2ws\" (UniqueName: \"kubernetes.io/projected/6af9d93a-8042-4a3f-a6d6-b7603c690151-kube-api-access-wc2ws\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925181 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925149 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-wtmp\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 
19:17:00.925275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925210 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-root\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925239 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-sys\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925265 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-textfile\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925294 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925336 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6af9d93a-8042-4a3f-a6d6-b7603c690151-metrics-client-ca\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " 
pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925353 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-wtmp\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925389 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2ws\" (UniqueName: \"kubernetes.io/projected/6af9d93a-8042-4a3f-a6d6-b7603c690151-kube-api-access-wc2ws\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925616 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925452 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-tls\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.925616 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925535 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-accelerators-collector-config\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925806 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-textfile\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925819 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-root\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.925864 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6af9d93a-8042-4a3f-a6d6-b7603c690151-sys\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.926112 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-accelerators-collector-config\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.926284 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6af9d93a-8042-4a3f-a6d6-b7603c690151-metrics-client-ca\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:17:00.926396 2539 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" 
not found Apr 28 19:17:00.926556 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:17:00.926450 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-tls podName:6af9d93a-8042-4a3f-a6d6-b7603c690151 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.426431025 +0000 UTC m=+47.722786370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-tls") pod "node-exporter-dxng4" (UID: "6af9d93a-8042-4a3f-a6d6-b7603c690151") : secret "node-exporter-tls" not found Apr 28 19:17:00.930087 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.930065 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:00.936656 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:00.936632 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2ws\" (UniqueName: \"kubernetes.io/projected/6af9d93a-8042-4a3f-a6d6-b7603c690151-kube-api-access-wc2ws\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:01.429859 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.429590 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-tls\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:01.432346 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:17:01.432320 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6af9d93a-8042-4a3f-a6d6-b7603c690151-node-exporter-tls\") pod \"node-exporter-dxng4\" (UID: \"6af9d93a-8042-4a3f-a6d6-b7603c690151\") " pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:01.506151 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.506119 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hl4tm" event={"ID":"f2e61248-2e6c-4f91-806d-ba6a148c3b71","Type":"ContainerStarted","Data":"3e11ce580af1785e1fd7a9af595e9e80ec9cb7a35e2fc91b640ede6565d9d4f4"} Apr 28 19:17:01.507672 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.507645 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2nhx" event={"ID":"f8173256-d810-4483-b373-4b19f554cbf6","Type":"ContainerStarted","Data":"3e9a387fa282f4fd1eb8a98de865a8f0806a020ea8bfe1083ae2996848cd14a4"} Apr 28 19:17:01.507792 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.507677 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2nhx" event={"ID":"f8173256-d810-4483-b373-4b19f554cbf6","Type":"ContainerStarted","Data":"0d8c5cdff4bfe2cd7d2a72a902649a420648848bf4562586662f6a79a072fa97"} Apr 28 19:17:01.507792 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.507769 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-d2nhx" Apr 28 19:17:01.509347 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.509324 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78qnv" event={"ID":"5c7a7b2d-077e-4f5f-96e5-571525a4f600","Type":"ContainerStarted","Data":"02ccd45a415d479f69777a0bdd4b92fe58c605af43a0d089dbdecc09706afcac"} Apr 28 19:17:01.535973 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.535934 2539 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hl4tm" podStartSLOduration=1.971678367 podStartE2EDuration="4.535921702s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.215731573 +0000 UTC m=+44.512086915" lastFinishedPulling="2026-04-28 19:17:00.779974906 +0000 UTC m=+47.076330250" observedRunningTime="2026-04-28 19:17:01.535242669 +0000 UTC m=+47.831598033" watchObservedRunningTime="2026-04-28 19:17:01.535921702 +0000 UTC m=+47.832277067" Apr 28 19:17:01.556861 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.556803 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-78qnv" podStartSLOduration=1.9839328950000001 podStartE2EDuration="4.556787172s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.20398053 +0000 UTC m=+44.500335871" lastFinishedPulling="2026-04-28 19:17:00.776834803 +0000 UTC m=+47.073190148" observedRunningTime="2026-04-28 19:17:01.556574822 +0000 UTC m=+47.852930186" watchObservedRunningTime="2026-04-28 19:17:01.556787172 +0000 UTC m=+47.853142538" Apr 28 19:17:01.574665 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.574542 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d2nhx" podStartSLOduration=1.891199676 podStartE2EDuration="4.574525866s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.09155796 +0000 UTC m=+44.387913317" lastFinishedPulling="2026-04-28 19:17:00.774884151 +0000 UTC m=+47.071239507" observedRunningTime="2026-04-28 19:17:01.574054381 +0000 UTC m=+47.870409747" watchObservedRunningTime="2026-04-28 19:17:01.574525866 +0000 UTC m=+47.870881227" Apr 28 19:17:01.665125 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.665085 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dxng4" Apr 28 19:17:01.672999 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:01.672969 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6af9d93a_8042_4a3f_a6d6_b7603c690151.slice/crio-edeae90ab1110756abb99d91ba0908e9fe2716da813ea6f162152095f4780a1b WatchSource:0}: Error finding container edeae90ab1110756abb99d91ba0908e9fe2716da813ea6f162152095f4780a1b: Status 404 returned error can't find the container with id edeae90ab1110756abb99d91ba0908e9fe2716da813ea6f162152095f4780a1b Apr 28 19:17:01.888217 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.888186 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:17:01.894155 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.894136 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:17:01.896773 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.896753 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 28 19:17:01.897290 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.897275 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 28 19:17:01.897290 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.897284 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 28 19:17:01.897493 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.897316 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 28 19:17:01.897576 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.897561 2539 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 28 19:17:01.897635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.897577 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 28 19:17:01.898084 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.898068 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 28 19:17:01.898328 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.898276 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2gnvw\"" Apr 28 19:17:01.898405 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.898343 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 28 19:17:01.898575 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.898561 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 28 19:17:01.906776 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.906749 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:17:01.933107 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933071 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:17:01.933107 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933112 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933313 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933133 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-web-config\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933313 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933186 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933313 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933253 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-out\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933313 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933272 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933313 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933302 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933350 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933405 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6gg\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-kube-api-access-zz6gg\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933426 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-volume\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933444 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933471 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:01.933521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:01.933494 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034015 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.033982 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034070 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-out\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034102 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034132 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034161 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034428 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:17:02.034253 2539 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 28 19:17:02.034428 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034293 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6gg\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-kube-api-access-zz6gg\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034428 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:17:02.034323 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls podName:6874f2b3-c20a-42c9-82f9-355f20028ad6 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.534302957 +0000 UTC m=+48.830658314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6") : secret "alertmanager-main-tls" not found
Apr 28 19:17:02.034594 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034423 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-volume\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034594 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034465 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034594 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034496 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.034594 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.034539 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.035137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.035074 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.035260 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.035135 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.035260 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.035179 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.035260 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.035204 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-web-config\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.035985 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.035637 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.036147 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.036063 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.037400 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.037334 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-out\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.038432 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.038287 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-volume\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.038432 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.038299 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.038793 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.038713 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.038793 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.038729 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.039081 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.039038 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.039169 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.039130 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-web-config\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.039311 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.039290 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.045093 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.045069 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6gg\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-kube-api-access-zz6gg\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.513546 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.513508 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxng4" event={"ID":"6af9d93a-8042-4a3f-a6d6-b7603c690151","Type":"ContainerStarted","Data":"edeae90ab1110756abb99d91ba0908e9fe2716da813ea6f162152095f4780a1b"}
Apr 28 19:17:02.538549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.538514 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.540923 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.540898 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.804537 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.804455 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 28 19:17:02.946599 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:02.946561 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 28 19:17:02.953416 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:02.953367 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6874f2b3_c20a_42c9_82f9_355f20028ad6.slice/crio-205482defc3e30ffa4bf50b524a92a7bc75cb74de69c7c3a3f8bf0672c360f7c WatchSource:0}: Error finding container 205482defc3e30ffa4bf50b524a92a7bc75cb74de69c7c3a3f8bf0672c360f7c: Status 404 returned error can't find the container with id 205482defc3e30ffa4bf50b524a92a7bc75cb74de69c7c3a3f8bf0672c360f7c
Apr 28 19:17:03.516752 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:03.516562 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"205482defc3e30ffa4bf50b524a92a7bc75cb74de69c7c3a3f8bf0672c360f7c"}
Apr 28 19:17:03.517791 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:03.517770 2539 generic.go:358] "Generic (PLEG): container finished" podID="6af9d93a-8042-4a3f-a6d6-b7603c690151" containerID="a599917eb62f9af4fb24414b5cd2cdcb0cb2a141d44346d81fd05184c5a19213" exitCode=0
Apr 28 19:17:03.517912 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:03.517801 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxng4" event={"ID":"6af9d93a-8042-4a3f-a6d6-b7603c690151","Type":"ContainerDied","Data":"a599917eb62f9af4fb24414b5cd2cdcb0cb2a141d44346d81fd05184c5a19213"}
Apr 28 19:17:04.522318 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:04.522283 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxng4" event={"ID":"6af9d93a-8042-4a3f-a6d6-b7603c690151","Type":"ContainerStarted","Data":"408d448abc27c953b1c0ed9dd307e3c631a9e798029027e4d0f4064a871b3db5"}
Apr 28 19:17:04.522318 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:04.522322 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxng4" event={"ID":"6af9d93a-8042-4a3f-a6d6-b7603c690151","Type":"ContainerStarted","Data":"1ed50904b142115ed4aca5193221c264e78f7917ab368c3226eb88ed66e1780a"}
Apr 28 19:17:04.542964 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:04.542914 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dxng4" podStartSLOduration=3.399048566 podStartE2EDuration="4.542901475s" podCreationTimestamp="2026-04-28 19:17:00 +0000 UTC" firstStartedPulling="2026-04-28 19:17:01.675234951 +0000 UTC m=+47.971590292" lastFinishedPulling="2026-04-28 19:17:02.819087847 +0000 UTC m=+49.115443201" observedRunningTime="2026-04-28 19:17:04.541480641 +0000 UTC m=+50.837836004" watchObservedRunningTime="2026-04-28 19:17:04.542901475 +0000 UTC m=+50.839256830"
Apr 28 19:17:05.077393 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.077347 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-f68884b84-56nbs"]
Apr 28 19:17:05.082547 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.082526 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.084613 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.084588 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 28 19:17:05.084613 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.084607 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 28 19:17:05.085174 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.085148 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 28 19:17:05.085251 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.085157 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-pcsf6\""
Apr 28 19:17:05.085632 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.085615 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9rpm4tmoqs4dd\""
Apr 28 19:17:05.085718 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.085665 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 28 19:17:05.094230 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.093686 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f68884b84-56nbs"]
Apr 28 19:17:05.161428 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161389 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1d9a61-7e7d-4030-a91a-57583b894f03-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.161591 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161443 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ae1d9a61-7e7d-4030-a91a-57583b894f03-audit-log\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.161591 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161466 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-secret-metrics-server-client-certs\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.161591 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161561 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wzcw\" (UniqueName: \"kubernetes.io/projected/ae1d9a61-7e7d-4030-a91a-57583b894f03-kube-api-access-5wzcw\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.161720 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161594 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-secret-metrics-server-tls\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.161720 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161615 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ae1d9a61-7e7d-4030-a91a-57583b894f03-metrics-server-audit-profiles\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.161720 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.161672 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-client-ca-bundle\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262581 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262542 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1d9a61-7e7d-4030-a91a-57583b894f03-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262743 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262644 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ae1d9a61-7e7d-4030-a91a-57583b894f03-audit-log\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262743 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262677 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-secret-metrics-server-client-certs\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262821 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262745 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wzcw\" (UniqueName: \"kubernetes.io/projected/ae1d9a61-7e7d-4030-a91a-57583b894f03-kube-api-access-5wzcw\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262821 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262777 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-secret-metrics-server-tls\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262821 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262810 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ae1d9a61-7e7d-4030-a91a-57583b894f03-metrics-server-audit-profiles\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.262970 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.262867 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-client-ca-bundle\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.263086 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.263062 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ae1d9a61-7e7d-4030-a91a-57583b894f03-audit-log\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.263328 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.263309 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1d9a61-7e7d-4030-a91a-57583b894f03-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.263890 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.263862 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ae1d9a61-7e7d-4030-a91a-57583b894f03-metrics-server-audit-profiles\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.265465 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.265445 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-secret-metrics-server-tls\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.265582 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.265566 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-client-ca-bundle\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.265632 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.265611 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ae1d9a61-7e7d-4030-a91a-57583b894f03-secret-metrics-server-client-certs\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.276223 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.276200 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wzcw\" (UniqueName: \"kubernetes.io/projected/ae1d9a61-7e7d-4030-a91a-57583b894f03-kube-api-access-5wzcw\") pod \"metrics-server-f68884b84-56nbs\" (UID: \"ae1d9a61-7e7d-4030-a91a-57583b894f03\") " pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.396614 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.396537 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f68884b84-56nbs"
Apr 28 19:17:05.526321 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.526157 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378" exitCode=0
Apr 28 19:17:05.527038 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.526241 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378"}
Apr 28 19:17:05.528505 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:05.528484 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f68884b84-56nbs"]
Apr 28 19:17:05.542032 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:05.542006 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1d9a61_7e7d_4030_a91a_57583b894f03.slice/crio-522e4fa58b23c4aeb5dc2fd4ef265c99c5d3d3a5879e2b0ce9b6407e2f6af412 WatchSource:0}: Error finding container 522e4fa58b23c4aeb5dc2fd4ef265c99c5d3d3a5879e2b0ce9b6407e2f6af412: Status 404 returned error can't find the container with id 522e4fa58b23c4aeb5dc2fd4ef265c99c5d3d3a5879e2b0ce9b6407e2f6af412
Apr 28 19:17:06.529991 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:06.529956 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" event={"ID":"ae1d9a61-7e7d-4030-a91a-57583b894f03","Type":"ContainerStarted","Data":"522e4fa58b23c4aeb5dc2fd4ef265c99c5d3d3a5879e2b0ce9b6407e2f6af412"}
Apr 28 19:17:07.054445 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.054413 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 28 19:17:07.091131 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.091101 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 28 19:17:07.091284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.091256 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:17:07.094011 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.093985 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 28 19:17:07.094161 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094082 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 28 19:17:07.094344 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094327 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 28 19:17:07.094629 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094590 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 28 19:17:07.094629 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094606 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 28 19:17:07.094629 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094613 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 28 19:17:07.094629 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094632 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 28 19:17:07.094981 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.094835 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 28 19:17:07.095124 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.095093 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lw4dl\""
Apr 28 19:17:07.095267 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.095243 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3vmqc2tmsh820\""
Apr 28 19:17:07.095402 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.095251 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 28 19:17:07.098519 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.098497 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 28 19:17:07.103235 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.103207 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 28 19:17:07.103604 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.103574 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 28 19:17:07.181547 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181508 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khc4w\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-kube-api-access-khc4w\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:17:07.181547 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181558 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:17:07.181814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181595 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:17:07.181814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181644 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:17:07.181814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181671 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:17:07.181814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181694 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.181814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181758 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.181814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181805 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181834 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181870 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181905 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181930 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181959 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.181990 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.182057 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 
19:17:07.182102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.182094 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182365 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.182126 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-config\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.182365 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.182153 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.282822 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.282782 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.282822 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.282829 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khc4w\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-kube-api-access-khc4w\") pod \"prometheus-k8s-0\" (UID: 
\"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.282856 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.282884 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283141 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283113 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283197 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283170 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283254 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283226 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283307 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283261 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283307 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283344 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283485 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283404 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283485 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283441 2539 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283583 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283468 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283583 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283543 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283583 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283579 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283725 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283626 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 
19:17:07.283725 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283653 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.283725 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.283692 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-config\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.284238 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.284148 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.284457 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.284435 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.285104 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.285025 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.286141 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.285844 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.286969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.286762 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.287213 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.287074 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.287507 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.287433 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.287507 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.287456 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.287507 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.287468 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-config\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.287507 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.287479 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.288078 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.288039 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.288543 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.288502 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.289782 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.289709 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.290364 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.290328 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.290603 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.290569 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.291926 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.291902 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.292652 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.292611 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.306513 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.306452 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khc4w\" (UniqueName: 
\"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-kube-api-access-khc4w\") pod \"prometheus-k8s-0\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.404921 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.404881 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:07.985805 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:07.985773 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:17:07.991673 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:07.991641 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa8c3570_cf87_4d22_976e_06f4436ea317.slice/crio-8a1e14bb580071dc1328a91694c74109abba445c1cc387ef7cbe7f13e0330f0c WatchSource:0}: Error finding container 8a1e14bb580071dc1328a91694c74109abba445c1cc387ef7cbe7f13e0330f0c: Status 404 returned error can't find the container with id 8a1e14bb580071dc1328a91694c74109abba445c1cc387ef7cbe7f13e0330f0c Apr 28 19:17:08.539460 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.539424 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb"} Apr 28 19:17:08.539460 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.539466 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24"} Apr 28 19:17:08.539460 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.539480 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967"} Apr 28 19:17:08.539756 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.539491 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6"} Apr 28 19:17:08.539756 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.539503 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29"} Apr 28 19:17:08.540681 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.540646 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" event={"ID":"ae1d9a61-7e7d-4030-a91a-57583b894f03","Type":"ContainerStarted","Data":"8f12351889711939956cb24cbcb5c52949bd4715a05df706184d9d6aea14f32d"} Apr 28 19:17:08.541974 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.541950 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerID="bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d" exitCode=0 Apr 28 19:17:08.542097 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.541982 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d"} Apr 28 19:17:08.542097 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.542001 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"8a1e14bb580071dc1328a91694c74109abba445c1cc387ef7cbe7f13e0330f0c"} Apr 28 19:17:08.566649 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:08.566599 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" podStartSLOduration=1.253098474 podStartE2EDuration="3.566584132s" podCreationTimestamp="2026-04-28 19:17:05 +0000 UTC" firstStartedPulling="2026-04-28 19:17:05.543694554 +0000 UTC m=+51.840049895" lastFinishedPulling="2026-04-28 19:17:07.857180207 +0000 UTC m=+54.153535553" observedRunningTime="2026-04-28 19:17:08.564616741 +0000 UTC m=+54.860972104" watchObservedRunningTime="2026-04-28 19:17:08.566584132 +0000 UTC m=+54.862939495" Apr 28 19:17:10.552151 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:10.552113 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerStarted","Data":"3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1"} Apr 28 19:17:10.581254 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:10.581204 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.622232019 podStartE2EDuration="9.581187038s" podCreationTimestamp="2026-04-28 19:17:01 +0000 UTC" firstStartedPulling="2026-04-28 19:17:02.955364594 +0000 UTC m=+49.251719936" lastFinishedPulling="2026-04-28 19:17:09.914319608 +0000 UTC m=+56.210674955" observedRunningTime="2026-04-28 19:17:10.579637467 +0000 UTC m=+56.875992830" watchObservedRunningTime="2026-04-28 19:17:10.581187038 +0000 UTC m=+56.877542436" Apr 28 19:17:11.516798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:11.516765 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-d2nhx" Apr 28 19:17:12.465011 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:12.464979 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-977nw" Apr 28 19:17:12.560085 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:12.560051 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851"} Apr 28 19:17:12.560085 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:12.560085 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969"} Apr 28 19:17:14.572789 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:14.572753 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf"} Apr 28 19:17:14.572789 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:14.572793 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245"} Apr 28 19:17:15.578960 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:15.578924 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0"} Apr 28 19:17:15.578960 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:15.578960 2539 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerStarted","Data":"5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba"} Apr 28 19:17:15.607867 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:15.607824 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.739611923 podStartE2EDuration="8.607809549s" podCreationTimestamp="2026-04-28 19:17:07 +0000 UTC" firstStartedPulling="2026-04-28 19:17:08.543079284 +0000 UTC m=+54.839434626" lastFinishedPulling="2026-04-28 19:17:14.411276907 +0000 UTC m=+60.707632252" observedRunningTime="2026-04-28 19:17:15.606657006 +0000 UTC m=+61.903012371" watchObservedRunningTime="2026-04-28 19:17:15.607809549 +0000 UTC m=+61.904164961" Apr 28 19:17:17.405812 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:17.405764 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:19.501150 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:19.501122 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85f54f8846-b82cj" Apr 28 19:17:20.003183 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.003151 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:17:20.005685 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.005665 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:17:20.016007 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:17:20.015972 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c344b2c-cf71-45b1-9143-e86be8d1b7b5-metrics-certs\") pod \"network-metrics-daemon-txdd9\" (UID: \"2c344b2c-cf71-45b1-9143-e86be8d1b7b5\") " pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:17:20.104601 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.104566 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:17:20.106765 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.106748 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:17:20.117329 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.117312 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:17:20.128620 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.128599 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ptnz\" (UniqueName: \"kubernetes.io/projected/dff9f9ea-63cc-4089-bb7e-e9fcb292c695-kube-api-access-9ptnz\") pod \"network-check-target-wg74q\" (UID: \"dff9f9ea-63cc-4089-bb7e-e9fcb292c695\") " pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:17:20.283101 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.283021 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6mdm8\"" Apr 28 19:17:20.290871 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.290844 2539 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-txdd9" Apr 28 19:17:20.298920 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.298853 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n7p66\"" Apr 28 19:17:20.306583 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.306548 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:17:20.420993 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.420963 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-txdd9"] Apr 28 19:17:20.424404 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:20.424358 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c344b2c_cf71_45b1_9143_e86be8d1b7b5.slice/crio-50c31272c6e4e258705bab419ce0975bd528553b0fe09892066cbbf279ee3ee5 WatchSource:0}: Error finding container 50c31272c6e4e258705bab419ce0975bd528553b0fe09892066cbbf279ee3ee5: Status 404 returned error can't find the container with id 50c31272c6e4e258705bab419ce0975bd528553b0fe09892066cbbf279ee3ee5 Apr 28 19:17:20.445066 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.445040 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wg74q"] Apr 28 19:17:20.448396 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:20.448345 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff9f9ea_63cc_4089_bb7e_e9fcb292c695.slice/crio-6e826aacef7dfcf9f88d3114c948fb4cda47ec7fe751569096a169662cb8cf5c WatchSource:0}: Error finding container 6e826aacef7dfcf9f88d3114c948fb4cda47ec7fe751569096a169662cb8cf5c: Status 404 returned error can't find the container with id 
6e826aacef7dfcf9f88d3114c948fb4cda47ec7fe751569096a169662cb8cf5c Apr 28 19:17:20.594057 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.593965 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wg74q" event={"ID":"dff9f9ea-63cc-4089-bb7e-e9fcb292c695","Type":"ContainerStarted","Data":"6e826aacef7dfcf9f88d3114c948fb4cda47ec7fe751569096a169662cb8cf5c"} Apr 28 19:17:20.594944 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:20.594921 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-txdd9" event={"ID":"2c344b2c-cf71-45b1-9143-e86be8d1b7b5","Type":"ContainerStarted","Data":"50c31272c6e4e258705bab419ce0975bd528553b0fe09892066cbbf279ee3ee5"} Apr 28 19:17:22.603015 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:22.602977 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-txdd9" event={"ID":"2c344b2c-cf71-45b1-9143-e86be8d1b7b5","Type":"ContainerStarted","Data":"2d85d6fa5b8224866d334eeb9a00299fac25470572002c778bd95027108ddace"} Apr 28 19:17:22.603015 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:22.603018 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-txdd9" event={"ID":"2c344b2c-cf71-45b1-9143-e86be8d1b7b5","Type":"ContainerStarted","Data":"972b1d1d4b54f00249157506e8c66006f8a6db6350357f36062c058b2366d758"} Apr 28 19:17:22.623522 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:22.623465 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-txdd9" podStartSLOduration=67.44396291300001 podStartE2EDuration="1m8.623447374s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:17:20.426224074 +0000 UTC m=+66.722579420" lastFinishedPulling="2026-04-28 19:17:21.605708536 +0000 UTC m=+67.902063881" observedRunningTime="2026-04-28 19:17:22.621027578 +0000 UTC 
m=+68.917382942" watchObservedRunningTime="2026-04-28 19:17:22.623447374 +0000 UTC m=+68.919802739" Apr 28 19:17:23.606985 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:23.606944 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wg74q" event={"ID":"dff9f9ea-63cc-4089-bb7e-e9fcb292c695","Type":"ContainerStarted","Data":"093a2065cc434c54ed36fc22af11e550e6abe149346ffb89e3db10e29e4c4d13"} Apr 28 19:17:23.607472 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:23.607065 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:17:23.628097 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:23.628039 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wg74q" podStartSLOduration=66.897897165 podStartE2EDuration="1m9.628020335s" podCreationTimestamp="2026-04-28 19:16:14 +0000 UTC" firstStartedPulling="2026-04-28 19:17:20.450170502 +0000 UTC m=+66.746525845" lastFinishedPulling="2026-04-28 19:17:23.180293657 +0000 UTC m=+69.476649015" observedRunningTime="2026-04-28 19:17:23.626694774 +0000 UTC m=+69.923050137" watchObservedRunningTime="2026-04-28 19:17:23.628020335 +0000 UTC m=+69.924375699" Apr 28 19:17:25.396969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:25.396860 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" Apr 28 19:17:25.396969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:25.396941 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" Apr 28 19:17:28.078572 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:28.078534 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:17:28.081007 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:28.080988 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:17:28.091689 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:28.091662 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5325db29-356b-4407-92e1-5ad3950aa605-original-pull-secret\") pod \"global-pull-secret-syncer-b4vxf\" (UID: \"5325db29-356b-4407-92e1-5ad3950aa605\") " pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:17:28.389291 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:28.389211 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4vxf" Apr 28 19:17:28.505462 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:28.505437 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b4vxf"] Apr 28 19:17:28.507655 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:17:28.507627 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5325db29_356b_4407_92e1_5ad3950aa605.slice/crio-1d63b49bb43fb7e438ee5661b74c2510f519c528d5d2372b5542ccd878140b09 WatchSource:0}: Error finding container 1d63b49bb43fb7e438ee5661b74c2510f519c528d5d2372b5542ccd878140b09: Status 404 returned error can't find the container with id 1d63b49bb43fb7e438ee5661b74c2510f519c528d5d2372b5542ccd878140b09 Apr 28 19:17:28.624054 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:28.624011 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b4vxf" 
event={"ID":"5325db29-356b-4407-92e1-5ad3950aa605","Type":"ContainerStarted","Data":"1d63b49bb43fb7e438ee5661b74c2510f519c528d5d2372b5542ccd878140b09"} Apr 28 19:17:32.637851 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:32.637814 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b4vxf" event={"ID":"5325db29-356b-4407-92e1-5ad3950aa605","Type":"ContainerStarted","Data":"44db6f0000dc53be14a157920c78dea1810c612b0855b8008116045baa4d0751"} Apr 28 19:17:32.655540 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:32.655483 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-b4vxf" podStartSLOduration=65.076048021 podStartE2EDuration="1m8.655468852s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:17:28.509478768 +0000 UTC m=+74.805834110" lastFinishedPulling="2026-04-28 19:17:32.088899595 +0000 UTC m=+78.385254941" observedRunningTime="2026-04-28 19:17:32.653989288 +0000 UTC m=+78.950344652" watchObservedRunningTime="2026-04-28 19:17:32.655468852 +0000 UTC m=+78.951824215" Apr 28 19:17:45.402394 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:45.402347 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" Apr 28 19:17:45.406305 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:45.406285 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-f68884b84-56nbs" Apr 28 19:17:51.878561 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:51.878519 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:51.896923 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:51.896899 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:52.421303 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:52.421277 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:17:54.611934 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:17:54.611905 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wg74q" Apr 28 19:18:01.220193 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.220157 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:18:01.220890 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.220836 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="alertmanager" containerID="cri-o://50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29" gracePeriod=120 Apr 28 19:18:01.221001 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.220903 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-metric" containerID="cri-o://19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb" gracePeriod=120 Apr 28 19:18:01.221140 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.220938 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy" containerID="cri-o://1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24" gracePeriod=120 Apr 28 19:18:01.221256 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.220946 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" 
containerName="config-reloader" containerID="cri-o://daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6" gracePeriod=120 Apr 28 19:18:01.221256 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.221106 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="prom-label-proxy" containerID="cri-o://3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1" gracePeriod=120 Apr 28 19:18:01.221389 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.221258 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-web" containerID="cri-o://ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967" gracePeriod=120 Apr 28 19:18:01.718886 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718850 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1" exitCode=0 Apr 28 19:18:01.718886 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718876 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb" exitCode=0 Apr 28 19:18:01.718886 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718882 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24" exitCode=0 Apr 28 19:18:01.718886 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718888 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6" 
exitCode=0 Apr 28 19:18:01.719137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718896 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29" exitCode=0 Apr 28 19:18:01.719137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718919 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1"} Apr 28 19:18:01.719137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718953 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb"} Apr 28 19:18:01.719137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718965 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24"} Apr 28 19:18:01.719137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718974 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6"} Apr 28 19:18:01.719137 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:01.718982 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29"} Apr 28 19:18:02.476053 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:18:02.476029 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:02.555298 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555211 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555298 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555242 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6gg\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-kube-api-access-zz6gg\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555298 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555274 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-main-db\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555578 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555399 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-tls-assets\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555578 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555445 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-metrics-client-ca\") pod 
\"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555578 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555466 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-cluster-tls-config\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555578 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555498 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-web\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555578 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555539 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-trusted-ca-bundle\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555837 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555583 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-volume\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555837 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555614 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: 
"6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:18:02.555837 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555623 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-web-config\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555837 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555686 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555837 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555720 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-out\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.555837 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.555763 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy\") pod \"6874f2b3-c20a-42c9-82f9-355f20028ad6\" (UID: \"6874f2b3-c20a-42c9-82f9-355f20028ad6\") " Apr 28 19:18:02.556136 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.556083 2539 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-main-db\") on node 
\"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.556192 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.556172 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:02.556442 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.556259 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:02.560023 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.559086 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.560023 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.559425 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:18:02.562899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.560257 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.562899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.561856 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-volume" (OuterVolumeSpecName: "config-volume") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.563713 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.563648 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-kube-api-access-zz6gg" (OuterVolumeSpecName: "kube-api-access-zz6gg") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "kube-api-access-zz6gg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:18:02.564103 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.564037 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.564212 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.564149 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.565278 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.565248 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-out" (OuterVolumeSpecName: "config-out") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:18:02.567131 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.567104 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.572568 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.572541 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-web-config" (OuterVolumeSpecName: "web-config") pod "6874f2b3-c20a-42c9-82f9-355f20028ad6" (UID: "6874f2b3-c20a-42c9-82f9-355f20028ad6"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:02.657350 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657306 2539 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657350 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657342 2539 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-volume\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657350 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657353 2539 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-web-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657363 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657394 2539 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6874f2b3-c20a-42c9-82f9-355f20028ad6-config-out\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657403 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy\") on node 
\"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657412 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-main-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657422 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zz6gg\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-kube-api-access-zz6gg\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657430 2539 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6874f2b3-c20a-42c9-82f9-355f20028ad6-tls-assets\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657439 2539 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6874f2b3-c20a-42c9-82f9-355f20028ad6-metrics-client-ca\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657447 2539 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-cluster-tls-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:02.657617 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.657456 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6874f2b3-c20a-42c9-82f9-355f20028ad6-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-206.ec2.internal\" 
DevicePath \"\"" Apr 28 19:18:02.724070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.724038 2539 generic.go:358] "Generic (PLEG): container finished" podID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerID="ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967" exitCode=0 Apr 28 19:18:02.724243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.724096 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967"} Apr 28 19:18:02.724243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.724120 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6874f2b3-c20a-42c9-82f9-355f20028ad6","Type":"ContainerDied","Data":"205482defc3e30ffa4bf50b524a92a7bc75cb74de69c7c3a3f8bf0672c360f7c"} Apr 28 19:18:02.724243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.724135 2539 scope.go:117] "RemoveContainer" containerID="3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1" Apr 28 19:18:02.724243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.724216 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:02.732013 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.731986 2539 scope.go:117] "RemoveContainer" containerID="19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb" Apr 28 19:18:02.738827 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.738808 2539 scope.go:117] "RemoveContainer" containerID="1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24" Apr 28 19:18:02.745084 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.745064 2539 scope.go:117] "RemoveContainer" containerID="ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967" Apr 28 19:18:02.746597 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.746575 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:18:02.750451 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.750427 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:18:02.752756 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.752731 2539 scope.go:117] "RemoveContainer" containerID="daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6" Apr 28 19:18:02.759094 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.759073 2539 scope.go:117] "RemoveContainer" containerID="50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29" Apr 28 19:18:02.765651 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.765635 2539 scope.go:117] "RemoveContainer" containerID="e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378" Apr 28 19:18:02.772070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772051 2539 scope.go:117] "RemoveContainer" containerID="3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1" Apr 28 19:18:02.772347 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.772325 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1\": container with ID starting with 3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1 not found: ID does not exist" containerID="3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1" Apr 28 19:18:02.772440 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772355 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1"} err="failed to get container status \"3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1\": rpc error: code = NotFound desc = could not find container \"3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1\": container with ID starting with 3fd34aca12725ed0e60bdbb487ec96d3d383e799963b17cc073a1988866927c1 not found: ID does not exist" Apr 28 19:18:02.772440 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772405 2539 scope.go:117] "RemoveContainer" containerID="19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb" Apr 28 19:18:02.772634 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.772617 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb\": container with ID starting with 19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb not found: ID does not exist" containerID="19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb" Apr 28 19:18:02.772673 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772640 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb"} err="failed to get container status \"19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb\": rpc error: code = NotFound desc 
= could not find container \"19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb\": container with ID starting with 19fc28983c61516832632d2f87df499beb2ff23d0948af40c63d9502dec46fcb not found: ID does not exist" Apr 28 19:18:02.772673 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772663 2539 scope.go:117] "RemoveContainer" containerID="1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24" Apr 28 19:18:02.772889 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.772874 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24\": container with ID starting with 1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24 not found: ID does not exist" containerID="1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24" Apr 28 19:18:02.772946 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772893 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24"} err="failed to get container status \"1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24\": rpc error: code = NotFound desc = could not find container \"1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24\": container with ID starting with 1f96226069969885a6acaddab5fa6a791ef399f3ffd896a3d8cfc05e441a1a24 not found: ID does not exist" Apr 28 19:18:02.772946 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.772909 2539 scope.go:117] "RemoveContainer" containerID="ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967" Apr 28 19:18:02.773170 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.773155 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967\": 
container with ID starting with ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967 not found: ID does not exist" containerID="ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967" Apr 28 19:18:02.773216 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773173 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967"} err="failed to get container status \"ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967\": rpc error: code = NotFound desc = could not find container \"ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967\": container with ID starting with ce199ce2a1bf62f3598063dba702b35b46141dfabc0c7f70030fc8a404222967 not found: ID does not exist" Apr 28 19:18:02.773216 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773185 2539 scope.go:117] "RemoveContainer" containerID="daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6" Apr 28 19:18:02.773422 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.773408 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6\": container with ID starting with daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6 not found: ID does not exist" containerID="daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6" Apr 28 19:18:02.773476 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773424 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6"} err="failed to get container status \"daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6\": rpc error: code = NotFound desc = could not find container \"daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6\": container with 
ID starting with daa36d081ee74ef1100499a58e77c2ea2b859d3856d234d078182ebd5e96e1e6 not found: ID does not exist" Apr 28 19:18:02.773476 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773434 2539 scope.go:117] "RemoveContainer" containerID="50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29" Apr 28 19:18:02.773638 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.773618 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29\": container with ID starting with 50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29 not found: ID does not exist" containerID="50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29" Apr 28 19:18:02.773677 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773646 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29"} err="failed to get container status \"50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29\": rpc error: code = NotFound desc = could not find container \"50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29\": container with ID starting with 50b12332a5ea4186b89b6e6dde210c0287cc0ef120391c05848565064a4f3a29 not found: ID does not exist" Apr 28 19:18:02.773677 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773660 2539 scope.go:117] "RemoveContainer" containerID="e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378" Apr 28 19:18:02.773871 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:02.773857 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378\": container with ID starting with e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378 not found: ID does 
not exist" containerID="e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378" Apr 28 19:18:02.773924 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:02.773873 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378"} err="failed to get container status \"e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378\": rpc error: code = NotFound desc = could not find container \"e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378\": container with ID starting with e6370d7c618000383dfa633e30a64355db73e776be76b775ad7d389ba022b378 not found: ID does not exist" Apr 28 19:18:04.270500 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:04.270466 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" path="/var/lib/kubelet/pods/6874f2b3-c20a-42c9-82f9-355f20028ad6/volumes" Apr 28 19:18:05.461151 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.461113 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:05.461751 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.461720 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="prometheus" containerID="cri-o://b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969" gracePeriod=600 Apr 28 19:18:05.461857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.461793 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-thanos" containerID="cri-o://29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0" gracePeriod=600 Apr 28 19:18:05.461857 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:18:05.461823 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-web" containerID="cri-o://a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf" gracePeriod=600 Apr 28 19:18:05.461968 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.461805 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="thanos-sidecar" containerID="cri-o://a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245" gracePeriod=600 Apr 28 19:18:05.461968 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.461880 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="config-reloader" containerID="cri-o://505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851" gracePeriod=600 Apr 28 19:18:05.462145 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.462109 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy" containerID="cri-o://5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba" gracePeriod=600 Apr 28 19:18:05.736508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736419 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerID="29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0" exitCode=0 Apr 28 19:18:05.736508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736443 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" 
containerID="5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba" exitCode=0 Apr 28 19:18:05.736508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736451 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerID="a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245" exitCode=0 Apr 28 19:18:05.736508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736457 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerID="505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851" exitCode=0 Apr 28 19:18:05.736508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736462 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerID="b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969" exitCode=0 Apr 28 19:18:05.736508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736493 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0"} Apr 28 19:18:05.736824 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736529 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba"} Apr 28 19:18:05.736824 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736540 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245"} Apr 28 19:18:05.736824 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736550 2539 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851"} Apr 28 19:18:05.736824 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:05.736559 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969"} Apr 28 19:18:06.744259 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.744233 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:06.753186 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.753149 2539 generic.go:358] "Generic (PLEG): container finished" podID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerID="a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf" exitCode=0 Apr 28 19:18:06.753322 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.753197 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf"} Apr 28 19:18:06.753322 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.753230 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa8c3570-cf87-4d22-976e-06f4436ea317","Type":"ContainerDied","Data":"8a1e14bb580071dc1328a91694c74109abba445c1cc387ef7cbe7f13e0330f0c"} Apr 28 19:18:06.753322 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.753251 2539 scope.go:117] "RemoveContainer" containerID="29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0" Apr 28 19:18:06.753512 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.753328 
2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:06.760135 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.760115 2539 scope.go:117] "RemoveContainer" containerID="5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba" Apr 28 19:18:06.767152 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.767107 2539 scope.go:117] "RemoveContainer" containerID="a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf" Apr 28 19:18:06.774283 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.774256 2539 scope.go:117] "RemoveContainer" containerID="a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245" Apr 28 19:18:06.783252 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.783223 2539 scope.go:117] "RemoveContainer" containerID="505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851" Apr 28 19:18:06.801089 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.801065 2539 scope.go:117] "RemoveContainer" containerID="b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969" Apr 28 19:18:06.809697 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.809669 2539 scope.go:117] "RemoveContainer" containerID="bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d" Apr 28 19:18:06.816446 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.816424 2539 scope.go:117] "RemoveContainer" containerID="29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0" Apr 28 19:18:06.816767 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.816739 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0\": container with ID starting with 29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0 not found: ID does not exist" containerID="29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0" Apr 
28 19:18:06.816868 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.816782 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0"} err="failed to get container status \"29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0\": rpc error: code = NotFound desc = could not find container \"29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0\": container with ID starting with 29507a64065ff4aeeb94c8754d0f5292a2c0e710fce1e4a9303785cc585b18f0 not found: ID does not exist" Apr 28 19:18:06.816868 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.816809 2539 scope.go:117] "RemoveContainer" containerID="5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba" Apr 28 19:18:06.817118 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.817100 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba\": container with ID starting with 5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba not found: ID does not exist" containerID="5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba" Apr 28 19:18:06.817165 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.817126 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba"} err="failed to get container status \"5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba\": rpc error: code = NotFound desc = could not find container \"5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba\": container with ID starting with 5a4160217ec8c4292b41ffa6bf5065bc492a409ba38f1dfe45771634a7f418ba not found: ID does not exist" Apr 28 19:18:06.817165 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.817144 2539 
scope.go:117] "RemoveContainer" containerID="a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf" Apr 28 19:18:06.817422 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.817398 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf\": container with ID starting with a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf not found: ID does not exist" containerID="a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf" Apr 28 19:18:06.817501 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.817431 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf"} err="failed to get container status \"a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf\": rpc error: code = NotFound desc = could not find container \"a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf\": container with ID starting with a3f3dbf6c396e0635b7853f983fc8a96072c83792d5ff3be8b5338c7c98c89bf not found: ID does not exist" Apr 28 19:18:06.817501 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.817456 2539 scope.go:117] "RemoveContainer" containerID="a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245" Apr 28 19:18:06.817748 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.817718 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245\": container with ID starting with a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245 not found: ID does not exist" containerID="a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245" Apr 28 19:18:06.817833 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.817756 2539 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245"} err="failed to get container status \"a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245\": rpc error: code = NotFound desc = could not find container \"a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245\": container with ID starting with a071fd1a6640edf6936fdb7b83d6302d15d0d3e4c5c8d56a61fc5cd54ac8e245 not found: ID does not exist" Apr 28 19:18:06.817833 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.817780 2539 scope.go:117] "RemoveContainer" containerID="505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851" Apr 28 19:18:06.818111 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.818078 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851\": container with ID starting with 505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851 not found: ID does not exist" containerID="505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851" Apr 28 19:18:06.818237 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.818104 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851"} err="failed to get container status \"505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851\": rpc error: code = NotFound desc = could not find container \"505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851\": container with ID starting with 505d8e97481ec8843be21c48d62f129eae75c22b725e2c08e5aaa0e9ce99f851 not found: ID does not exist" Apr 28 19:18:06.818237 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.818126 2539 scope.go:117] "RemoveContainer" 
containerID="b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969" Apr 28 19:18:06.818448 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.818399 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969\": container with ID starting with b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969 not found: ID does not exist" containerID="b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969" Apr 28 19:18:06.818448 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.818427 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969"} err="failed to get container status \"b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969\": rpc error: code = NotFound desc = could not find container \"b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969\": container with ID starting with b74473b964cf41b15c468c5de94ea67286ec65737e8c17801fccadba865a1969 not found: ID does not exist" Apr 28 19:18:06.818448 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.818447 2539 scope.go:117] "RemoveContainer" containerID="bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d" Apr 28 19:18:06.818707 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:18:06.818689 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d\": container with ID starting with bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d not found: ID does not exist" containerID="bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d" Apr 28 19:18:06.818783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.818713 2539 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d"} err="failed to get container status \"bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d\": rpc error: code = NotFound desc = could not find container \"bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d\": container with ID starting with bc54854cd06b2f07b621339bf1821d0a62a8db08fc5844c2d54301238412df1d not found: ID does not exist" Apr 28 19:18:06.891668 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891636 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-trusted-ca-bundle\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891676 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-tls-assets\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891702 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-web-config\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891725 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: 
\"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891743 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-config-out\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891764 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-db\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891786 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-metrics-client-ca\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891813 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.891845 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891839 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-kube-rbac-proxy\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " 
Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891867 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-metrics-client-certs\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891894 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-config\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891928 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-thanos-prometheus-http-client-file\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891958 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-kubelet-serving-ca-bundle\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.891985 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-rulefiles-0\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.892009 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-serving-certs-ca-bundle\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.892049 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-tls\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.892074 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-grpc-tls\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.892253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.892106 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khc4w\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-kube-api-access-khc4w\") pod \"aa8c3570-cf87-4d22-976e-06f4436ea317\" (UID: \"aa8c3570-cf87-4d22-976e-06f4436ea317\") " Apr 28 19:18:06.894199 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.892065 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:06.894199 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.892688 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:18:06.894199 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.893472 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:06.894199 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.893745 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:06.894689 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.894664 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-config-out" (OuterVolumeSpecName: "config-out") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:18:06.894884 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.894829 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.895003 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.894987 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:06.896180 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.896145 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:18:06.896280 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.896199 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-kube-api-access-khc4w" (OuterVolumeSpecName: "kube-api-access-khc4w") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). 
InnerVolumeSpecName "kube-api-access-khc4w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:18:06.896732 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.896710 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.897190 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.897151 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:18:06.897292 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.897185 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.897386 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.897348 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.897764 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.897741 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.898092 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.898060 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-config" (OuterVolumeSpecName: "config") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.898341 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.898311 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.898774 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.898755 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.906157 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.906131 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-web-config" (OuterVolumeSpecName: "web-config") pod "aa8c3570-cf87-4d22-976e-06f4436ea317" (UID: "aa8c3570-cf87-4d22-976e-06f4436ea317"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:18:06.993099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993054 2539 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-web-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993099 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993095 2539 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993111 2539 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-config-out\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993127 2539 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-db\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993142 2539 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-metrics-client-ca\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993160 2539 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993175 2539 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-kube-rbac-proxy\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993192 2539 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-metrics-client-certs\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993206 2539 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993221 2539 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993235 2539 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993250 2539 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993265 2539 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993286 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993280 2539 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993631 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993294 2539 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa8c3570-cf87-4d22-976e-06f4436ea317-secret-grpc-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993631 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993311 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khc4w\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-kube-api-access-khc4w\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993631 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993324 2539 reconciler_common.go:299] "Volume detached for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa8c3570-cf87-4d22-976e-06f4436ea317-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:06.993631 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:06.993336 2539 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa8c3570-cf87-4d22-976e-06f4436ea317-tls-assets\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:18:07.081616 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.081587 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:07.092430 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.092392 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:07.127870 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.127833 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:07.128076 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128064 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-web" Apr 28 19:18:07.128114 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128077 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-web" Apr 28 19:18:07.128114 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128086 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="prom-label-proxy" Apr 28 19:18:07.128114 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128092 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="prom-label-proxy" Apr 28 19:18:07.128114 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:18:07.128103 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="init-config-reloader" Apr 28 19:18:07.128114 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128110 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="init-config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128117 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-web" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128123 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-web" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128130 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-metric" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128135 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-metric" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128141 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="thanos-sidecar" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128146 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="thanos-sidecar" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128153 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" 
containerName="prometheus" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128159 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="prometheus" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128164 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128169 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128176 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128181 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128186 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="init-config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128191 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="init-config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128198 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128204 2539 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128209 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-thanos" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128214 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-thanos" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128221 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128226 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="config-reloader" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128233 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="alertmanager" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128238 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="alertmanager" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128272 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="thanos-sidecar" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128280 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="prom-label-proxy" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128286 2539 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-thanos" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128293 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="prometheus" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128299 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="alertmanager" Apr 28 19:18:07.128296 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128305 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-metric" Apr 28 19:18:07.129090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128311 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="config-reloader" Apr 28 19:18:07.129090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128317 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy" Apr 28 19:18:07.129090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128322 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="config-reloader" Apr 28 19:18:07.129090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128327 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="6874f2b3-c20a-42c9-82f9-355f20028ad6" containerName="kube-rbac-proxy-web" Apr 28 19:18:07.129090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128332 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy-web" Apr 28 19:18:07.129090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.128338 2539 
memory_manager.go:356] "RemoveStaleState removing state" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" containerName="kube-rbac-proxy" Apr 28 19:18:07.131807 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.131778 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.136458 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.136427 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 28 19:18:07.136580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.136473 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lw4dl\"" Apr 28 19:18:07.136694 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.136673 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 28 19:18:07.136779 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.136759 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 28 19:18:07.137094 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.137072 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 28 19:18:07.137165 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.137146 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 28 19:18:07.137341 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.137320 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 28 19:18:07.137563 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.137544 2539 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 28 19:18:07.137622 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.137590 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 28 19:18:07.137699 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.137078 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3vmqc2tmsh820\"" Apr 28 19:18:07.139067 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.139037 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 28 19:18:07.140692 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.140670 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 28 19:18:07.141599 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.141555 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 28 19:18:07.148143 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.148120 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 28 19:18:07.150682 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.150661 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:07.195040 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.194998 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195208 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195059 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195208 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195099 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195208 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195131 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195208 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195202 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195361 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195247 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195361 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195283 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195361 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195314 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-web-config\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195543 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195362 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-config\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195543 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195422 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
28 19:18:07.195543 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195473 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b64d7f5a-30af-486f-af17-7e12f3783d7d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195543 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195505 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195543 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195532 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b64d7f5a-30af-486f-af17-7e12f3783d7d-config-out\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195557 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195573 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt875\" (UniqueName: \"kubernetes.io/projected/b64d7f5a-30af-486f-af17-7e12f3783d7d-kube-api-access-lt875\") pod 
\"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195597 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195618 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.195715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.195645 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296834 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296805 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296843 2539 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296869 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296899 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296925 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296942 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.296996 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.296968 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297139 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297191 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297222 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-web-config\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297261 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-config\") pod \"prometheus-k8s-0\" (UID: 
\"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297287 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297310 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b64d7f5a-30af-486f-af17-7e12f3783d7d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297336 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297400 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b64d7f5a-30af-486f-af17-7e12f3783d7d-config-out\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297429 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297454 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt875\" (UniqueName: \"kubernetes.io/projected/b64d7f5a-30af-486f-af17-7e12f3783d7d-kube-api-access-lt875\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.297549 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.297494 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.298036 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.298011 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.298125 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.298094 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.299035 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.298691 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.299271 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.299234 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.299516 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.299487 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.300205 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.300174 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.300542 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.300495 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.301521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.301476 2539 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.301609 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.301593 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b64d7f5a-30af-486f-af17-7e12f3783d7d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.301850 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.301827 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b64d7f5a-30af-486f-af17-7e12f3783d7d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.301944 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.301923 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.302264 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.302243 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.302564 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:18:07.302533 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.302899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.302839 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b64d7f5a-30af-486f-af17-7e12f3783d7d-config-out\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.303631 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.303603 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-config\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.304240 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.304221 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-web-config\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.304443 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.304420 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b64d7f5a-30af-486f-af17-7e12f3783d7d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.307752 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.307710 2539 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lt875\" (UniqueName: \"kubernetes.io/projected/b64d7f5a-30af-486f-af17-7e12f3783d7d-kube-api-access-lt875\") pod \"prometheus-k8s-0\" (UID: \"b64d7f5a-30af-486f-af17-7e12f3783d7d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.450106 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.450029 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:07.614061 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.614028 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:07.761217 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.760016 2539 generic.go:358] "Generic (PLEG): container finished" podID="b64d7f5a-30af-486f-af17-7e12f3783d7d" containerID="9a4879b53e4998c813dcfb690949d08271ef9805422a3a98b0419ed003f069f3" exitCode=0 Apr 28 19:18:07.761217 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.760114 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerDied","Data":"9a4879b53e4998c813dcfb690949d08271ef9805422a3a98b0419ed003f069f3"} Apr 28 19:18:07.761217 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:07.760143 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"003d16585362219084f4c33baac80b47ceb349b527240ff7ca2ccd859623a3ca"} Apr 28 19:18:08.271585 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.271545 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa8c3570-cf87-4d22-976e-06f4436ea317" path="/var/lib/kubelet/pods/aa8c3570-cf87-4d22-976e-06f4436ea317/volumes" Apr 28 19:18:08.766809 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.766772 2539 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"186278798b9b2de4dcbcd3d71685650050ddf9eccb299c5b3b170d58b3ed63cf"} Apr 28 19:18:08.766809 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.766807 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"cdd6d6303723ed1db03b39593eb2892a9955853ef814043ea9d80922ede58d1e"} Apr 28 19:18:08.766809 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.766816 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"4d5d82ad0aa6ca87022058107f0c0dd298136ee00051e1a408a480daa83041a9"} Apr 28 19:18:08.767401 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.766824 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"8a3c189498ce69a4a8be8cd8c29cff3e3b0307a4cf68477e9289ff7d03afde23"} Apr 28 19:18:08.767401 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.766833 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"743808eeb3a10216749a9f137eb900e48930bf31d33fa8f52ee65a01ac680626"} Apr 28 19:18:08.767401 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.766840 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b64d7f5a-30af-486f-af17-7e12f3783d7d","Type":"ContainerStarted","Data":"bc8140b0068eb7ade1101247bb09eb59bded821e15c0d208485fbee601a2eed8"} Apr 28 19:18:08.796902 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:08.796854 2539 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.7968350339999999 podStartE2EDuration="1.796835034s" podCreationTimestamp="2026-04-28 19:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:08.795574081 +0000 UTC m=+115.091929447" watchObservedRunningTime="2026-04-28 19:18:08.796835034 +0000 UTC m=+115.093190399" Apr 28 19:18:12.450304 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:18:12.450261 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:07.451003 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:07.450949 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:07.465783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:07.465741 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:07.939360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:07.939337 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:25.754859 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.754827 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l4s6p"] Apr 28 19:19:25.756687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.756669 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.760757 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.760731 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 28 19:19:25.761039 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.761020 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 28 19:19:25.761090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.761064 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 28 19:19:25.761136 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.761020 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 28 19:19:25.761136 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.761117 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rcjxh\"" Apr 28 19:19:25.761136 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.761120 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 28 19:19:25.772342 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.772300 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l4s6p"] Apr 28 19:19:25.881239 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.881206 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.881468 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:19:25.881264 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd4v6\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-kube-api-access-fd4v6\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.881468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.881405 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e040278a-19a5-4d1e-8d10-86a4c780c677-cabundle0\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.982660 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.982622 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e040278a-19a5-4d1e-8d10-86a4c780c677-cabundle0\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.982660 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.982665 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.982866 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.982689 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd4v6\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-kube-api-access-fd4v6\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " 
pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.982866 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:25.982776 2539 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:19:25.982866 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:25.982796 2539 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:19:25.982866 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:25.982808 2539 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l4s6p: references non-existent secret key: ca.crt Apr 28 19:19:25.983003 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:25.982879 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates podName:e040278a-19a5-4d1e-8d10-86a4c780c677 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:26.482858569 +0000 UTC m=+192.779213930 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates") pod "keda-operator-ffbb595cb-l4s6p" (UID: "e040278a-19a5-4d1e-8d10-86a4c780c677") : references non-existent secret key: ca.crt Apr 28 19:19:25.983329 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.983311 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e040278a-19a5-4d1e-8d10-86a4c780c677-cabundle0\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:25.995689 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:25.995657 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd4v6\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-kube-api-access-fd4v6\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:26.487598 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.487562 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:26.487769 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.487682 2539 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:19:26.487769 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.487694 2539 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:19:26.487769 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.487702 2539 projected.go:194] Error preparing data for projected 
volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l4s6p: references non-existent secret key: ca.crt Apr 28 19:19:26.487769 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.487756 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates podName:e040278a-19a5-4d1e-8d10-86a4c780c677 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.487743236 +0000 UTC m=+193.784098579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates") pod "keda-operator-ffbb595cb-l4s6p" (UID: "e040278a-19a5-4d1e-8d10-86a4c780c677") : references non-existent secret key: ca.crt Apr 28 19:19:26.492361 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.492335 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-zx424"] Apr 28 19:19:26.494361 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.494346 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:26.496636 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.496605 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 28 19:19:26.512496 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.512463 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zx424"] Apr 28 19:19:26.588780 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.588742 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-certificates\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:26.588958 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.588794 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-kube-api-access-gqlwx\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:26.689928 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.689894 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-certificates\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:26.690088 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.689939 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlwx\" (UniqueName: 
\"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-kube-api-access-gqlwx\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:26.690088 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.690054 2539 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 28 19:19:26.690088 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.690087 2539 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-zx424: secret "keda-admission-webhooks-certs" not found Apr 28 19:19:26.690191 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:26.690141 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-certificates podName:03b6af0e-f83d-482a-bca0-daa2af4cb964 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.190124654 +0000 UTC m=+193.486479996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-certificates") pod "keda-admission-cf49989db-zx424" (UID: "03b6af0e-f83d-482a-bca0-daa2af4cb964") : secret "keda-admission-webhooks-certs" not found Apr 28 19:19:26.702950 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:26.702915 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlwx\" (UniqueName: \"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-kube-api-access-gqlwx\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:27.194306 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:27.194266 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-certificates\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:27.196765 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:27.196740 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/03b6af0e-f83d-482a-bca0-daa2af4cb964-certificates\") pod \"keda-admission-cf49989db-zx424\" (UID: \"03b6af0e-f83d-482a-bca0-daa2af4cb964\") " pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:27.404357 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:27.404322 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:27.496103 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:27.496055 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:27.496244 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:27.496223 2539 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:19:27.496285 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:27.496252 2539 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:19:27.496285 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:27.496264 2539 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l4s6p: references non-existent secret key: ca.crt Apr 28 19:19:27.496358 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:27.496335 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates podName:e040278a-19a5-4d1e-8d10-86a4c780c677 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:29.49631526 +0000 UTC m=+195.792670627 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates") pod "keda-operator-ffbb595cb-l4s6p" (UID: "e040278a-19a5-4d1e-8d10-86a4c780c677") : references non-existent secret key: ca.crt Apr 28 19:19:27.522395 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:27.522347 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zx424"] Apr 28 19:19:27.524567 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:19:27.524539 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b6af0e_f83d_482a_bca0_daa2af4cb964.slice/crio-51f226b93c7df70449f234011a899c3ef79608bcf9e49b7d7568e63f6832474d WatchSource:0}: Error finding container 51f226b93c7df70449f234011a899c3ef79608bcf9e49b7d7568e63f6832474d: Status 404 returned error can't find the container with id 51f226b93c7df70449f234011a899c3ef79608bcf9e49b7d7568e63f6832474d Apr 28 19:19:27.979542 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:27.979503 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zx424" event={"ID":"03b6af0e-f83d-482a-bca0-daa2af4cb964","Type":"ContainerStarted","Data":"51f226b93c7df70449f234011a899c3ef79608bcf9e49b7d7568e63f6832474d"} Apr 28 19:19:29.514849 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:29.514812 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:29.515258 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:29.514965 2539 secret.go:281] references non-existent secret key: ca.crt Apr 28 19:19:29.515258 ip-10-0-143-206 kubenswrapper[2539]: E0428 
19:19:29.514989 2539 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 28 19:19:29.515258 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:29.515013 2539 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l4s6p: references non-existent secret key: ca.crt Apr 28 19:19:29.515258 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:19:29.515065 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates podName:e040278a-19a5-4d1e-8d10-86a4c780c677 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:33.51505108 +0000 UTC m=+199.811406422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates") pod "keda-operator-ffbb595cb-l4s6p" (UID: "e040278a-19a5-4d1e-8d10-86a4c780c677") : references non-existent secret key: ca.crt Apr 28 19:19:29.986580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:29.986541 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zx424" event={"ID":"03b6af0e-f83d-482a-bca0-daa2af4cb964","Type":"ContainerStarted","Data":"6e53ad81a1a546d5821eff92e6c17bfd8cec03517cbb36c5f0f326f4634a41a7"} Apr 28 19:19:29.986747 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:29.986666 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:30.007014 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:30.006968 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-zx424" podStartSLOduration=1.697828825 podStartE2EDuration="4.006953206s" podCreationTimestamp="2026-04-28 19:19:26 +0000 UTC" firstStartedPulling="2026-04-28 19:19:27.52574839 
+0000 UTC m=+193.822103731" lastFinishedPulling="2026-04-28 19:19:29.83487277 +0000 UTC m=+196.131228112" observedRunningTime="2026-04-28 19:19:30.005426123 +0000 UTC m=+196.301781487" watchObservedRunningTime="2026-04-28 19:19:30.006953206 +0000 UTC m=+196.303308570" Apr 28 19:19:33.545191 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:33.545152 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:33.547516 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:33.547496 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e040278a-19a5-4d1e-8d10-86a4c780c677-certificates\") pod \"keda-operator-ffbb595cb-l4s6p\" (UID: \"e040278a-19a5-4d1e-8d10-86a4c780c677\") " pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:33.566514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:33.566485 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:33.690084 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:33.690054 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l4s6p"] Apr 28 19:19:33.693946 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:19:33.693916 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode040278a_19a5_4d1e_8d10_86a4c780c677.slice/crio-29cd702662ec4381cec1ee4e35e62ec67e9a74dbe0ac12468cdea3dd0f237860 WatchSource:0}: Error finding container 29cd702662ec4381cec1ee4e35e62ec67e9a74dbe0ac12468cdea3dd0f237860: Status 404 returned error can't find the container with id 29cd702662ec4381cec1ee4e35e62ec67e9a74dbe0ac12468cdea3dd0f237860 Apr 28 19:19:33.997965 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:33.997933 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" event={"ID":"e040278a-19a5-4d1e-8d10-86a4c780c677","Type":"ContainerStarted","Data":"29cd702662ec4381cec1ee4e35e62ec67e9a74dbe0ac12468cdea3dd0f237860"} Apr 28 19:19:38.010076 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:38.010042 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" event={"ID":"e040278a-19a5-4d1e-8d10-86a4c780c677","Type":"ContainerStarted","Data":"56c1db252254e73d5cb4010e1beef10436618cbe40057b36b3ef1cdeb703a6a6"} Apr 28 19:19:38.010484 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:38.010193 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:19:50.992303 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:50.992275 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-zx424" Apr 28 19:19:51.013652 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:19:51.013589 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" podStartSLOduration=22.236104229 podStartE2EDuration="26.013569543s" podCreationTimestamp="2026-04-28 19:19:25 +0000 UTC" firstStartedPulling="2026-04-28 19:19:33.695664583 +0000 UTC m=+199.992019928" lastFinishedPulling="2026-04-28 19:19:37.473129895 +0000 UTC m=+203.769485242" observedRunningTime="2026-04-28 19:19:38.035592798 +0000 UTC m=+204.331948187" watchObservedRunningTime="2026-04-28 19:19:51.013569543 +0000 UTC m=+217.309924920" Apr 28 19:19:59.014911 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:19:59.014882 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-l4s6p" Apr 28 19:20:32.759100 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.759062 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-8n7bc"] Apr 28 19:20:32.761141 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.761122 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:32.764930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.764907 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:20:32.765746 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.765707 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-4whvb\"" Apr 28 19:20:32.765746 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.765707 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:20:32.765875 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.765710 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 28 19:20:32.773723 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.773705 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-8n7bc"] Apr 28 19:20:32.799839 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.799813 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41937fc5-7383-45cb-8790-7336565e98c3-cert\") pod \"kserve-controller-manager-b85c69797-8n7bc\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:32.799986 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.799851 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rm4d\" (UniqueName: \"kubernetes.io/projected/41937fc5-7383-45cb-8790-7336565e98c3-kube-api-access-7rm4d\") pod \"kserve-controller-manager-b85c69797-8n7bc\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 
19:20:32.900641 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.900603 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41937fc5-7383-45cb-8790-7336565e98c3-cert\") pod \"kserve-controller-manager-b85c69797-8n7bc\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:32.900806 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.900649 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rm4d\" (UniqueName: \"kubernetes.io/projected/41937fc5-7383-45cb-8790-7336565e98c3-kube-api-access-7rm4d\") pod \"kserve-controller-manager-b85c69797-8n7bc\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:32.902934 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.902915 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41937fc5-7383-45cb-8790-7336565e98c3-cert\") pod \"kserve-controller-manager-b85c69797-8n7bc\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:32.909246 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:32.909222 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rm4d\" (UniqueName: \"kubernetes.io/projected/41937fc5-7383-45cb-8790-7336565e98c3-kube-api-access-7rm4d\") pod \"kserve-controller-manager-b85c69797-8n7bc\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:33.070792 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:33.070707 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:20:33.196925 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:33.196870 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-8n7bc"] Apr 28 19:20:33.199543 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:20:33.199517 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41937fc5_7383_45cb_8790_7336565e98c3.slice/crio-3b2311c5bbbd0cddcfb990ce00fff5a0a66d2bc79fbd4af966f7bfcb24a96634 WatchSource:0}: Error finding container 3b2311c5bbbd0cddcfb990ce00fff5a0a66d2bc79fbd4af966f7bfcb24a96634: Status 404 returned error can't find the container with id 3b2311c5bbbd0cddcfb990ce00fff5a0a66d2bc79fbd4af966f7bfcb24a96634 Apr 28 19:20:34.159504 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:34.159468 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" event={"ID":"41937fc5-7383-45cb-8790-7336565e98c3","Type":"ContainerStarted","Data":"3b2311c5bbbd0cddcfb990ce00fff5a0a66d2bc79fbd4af966f7bfcb24a96634"} Apr 28 19:20:37.168312 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:37.168278 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" event={"ID":"41937fc5-7383-45cb-8790-7336565e98c3","Type":"ContainerStarted","Data":"e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff"} Apr 28 19:20:37.168702 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:20:37.168400 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:21:08.176753 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:08.176721 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:21:08.195241 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:21:08.195191 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" podStartSLOduration=32.442005997 podStartE2EDuration="36.195176942s" podCreationTimestamp="2026-04-28 19:20:32 +0000 UTC" firstStartedPulling="2026-04-28 19:20:33.200717696 +0000 UTC m=+259.497073041" lastFinishedPulling="2026-04-28 19:20:36.953888636 +0000 UTC m=+263.250243986" observedRunningTime="2026-04-28 19:20:37.188634907 +0000 UTC m=+263.484990269" watchObservedRunningTime="2026-04-28 19:21:08.195176942 +0000 UTC m=+294.491532305" Apr 28 19:21:10.198275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.198236 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-8n7bc"] Apr 28 19:21:10.198867 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.198539 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" podUID="41937fc5-7383-45cb-8790-7336565e98c3" containerName="manager" containerID="cri-o://e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff" gracePeriod=10 Apr 28 19:21:10.270506 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.270471 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-jx7z7"] Apr 28 19:21:10.272386 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.272361 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.307804 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.307766 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-jx7z7"] Apr 28 19:21:10.393076 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.393044 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94647eb0-60e0-4a5e-a906-7035ffdc4738-cert\") pod \"kserve-controller-manager-b85c69797-jx7z7\" (UID: \"94647eb0-60e0-4a5e-a906-7035ffdc4738\") " pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.393076 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.393080 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nmcb\" (UniqueName: \"kubernetes.io/projected/94647eb0-60e0-4a5e-a906-7035ffdc4738-kube-api-access-6nmcb\") pod \"kserve-controller-manager-b85c69797-jx7z7\" (UID: \"94647eb0-60e0-4a5e-a906-7035ffdc4738\") " pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.428394 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.428354 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:21:10.493640 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.493551 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94647eb0-60e0-4a5e-a906-7035ffdc4738-cert\") pod \"kserve-controller-manager-b85c69797-jx7z7\" (UID: \"94647eb0-60e0-4a5e-a906-7035ffdc4738\") " pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.493640 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.493585 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nmcb\" (UniqueName: \"kubernetes.io/projected/94647eb0-60e0-4a5e-a906-7035ffdc4738-kube-api-access-6nmcb\") pod \"kserve-controller-manager-b85c69797-jx7z7\" (UID: \"94647eb0-60e0-4a5e-a906-7035ffdc4738\") " pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.495987 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.495960 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94647eb0-60e0-4a5e-a906-7035ffdc4738-cert\") pod \"kserve-controller-manager-b85c69797-jx7z7\" (UID: \"94647eb0-60e0-4a5e-a906-7035ffdc4738\") " pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.520570 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.520539 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nmcb\" (UniqueName: \"kubernetes.io/projected/94647eb0-60e0-4a5e-a906-7035ffdc4738-kube-api-access-6nmcb\") pod \"kserve-controller-manager-b85c69797-jx7z7\" (UID: \"94647eb0-60e0-4a5e-a906-7035ffdc4738\") " pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.581335 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.581296 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:10.594644 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.594618 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41937fc5-7383-45cb-8790-7336565e98c3-cert\") pod \"41937fc5-7383-45cb-8790-7336565e98c3\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " Apr 28 19:21:10.594751 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.594699 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rm4d\" (UniqueName: \"kubernetes.io/projected/41937fc5-7383-45cb-8790-7336565e98c3-kube-api-access-7rm4d\") pod \"41937fc5-7383-45cb-8790-7336565e98c3\" (UID: \"41937fc5-7383-45cb-8790-7336565e98c3\") " Apr 28 19:21:10.596769 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.596742 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41937fc5-7383-45cb-8790-7336565e98c3-kube-api-access-7rm4d" (OuterVolumeSpecName: "kube-api-access-7rm4d") pod "41937fc5-7383-45cb-8790-7336565e98c3" (UID: "41937fc5-7383-45cb-8790-7336565e98c3"). InnerVolumeSpecName "kube-api-access-7rm4d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:21:10.596863 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.596793 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41937fc5-7383-45cb-8790-7336565e98c3-cert" (OuterVolumeSpecName: "cert") pod "41937fc5-7383-45cb-8790-7336565e98c3" (UID: "41937fc5-7383-45cb-8790-7336565e98c3"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:10.695682 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.695654 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rm4d\" (UniqueName: \"kubernetes.io/projected/41937fc5-7383-45cb-8790-7336565e98c3-kube-api-access-7rm4d\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:21:10.695682 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.695679 2539 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41937fc5-7383-45cb-8790-7336565e98c3-cert\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:21:10.707890 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:10.707862 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-jx7z7"] Apr 28 19:21:10.717717 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:21:10.717688 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94647eb0_60e0_4a5e_a906_7035ffdc4738.slice/crio-e26d9af75815d1c54771491d732a8a12ea7be234b92fe8cfa848bfcd64270296 WatchSource:0}: Error finding container e26d9af75815d1c54771491d732a8a12ea7be234b92fe8cfa848bfcd64270296: Status 404 returned error can't find the container with id e26d9af75815d1c54771491d732a8a12ea7be234b92fe8cfa848bfcd64270296 Apr 28 19:21:11.258036 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.258000 2539 generic.go:358] "Generic (PLEG): container finished" podID="41937fc5-7383-45cb-8790-7336565e98c3" containerID="e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff" exitCode=0 Apr 28 19:21:11.258481 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.258066 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" 
event={"ID":"41937fc5-7383-45cb-8790-7336565e98c3","Type":"ContainerDied","Data":"e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff"} Apr 28 19:21:11.258481 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.258094 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" event={"ID":"41937fc5-7383-45cb-8790-7336565e98c3","Type":"ContainerDied","Data":"3b2311c5bbbd0cddcfb990ce00fff5a0a66d2bc79fbd4af966f7bfcb24a96634"} Apr 28 19:21:11.258481 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.258068 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-8n7bc" Apr 28 19:21:11.258481 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.258116 2539 scope.go:117] "RemoveContainer" containerID="e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff" Apr 28 19:21:11.259630 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.259598 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" event={"ID":"94647eb0-60e0-4a5e-a906-7035ffdc4738","Type":"ContainerStarted","Data":"934819392a974f1086d6a98ea97ad663f607fec472fb5760c29ede70a24e1f9b"} Apr 28 19:21:11.259742 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.259635 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" event={"ID":"94647eb0-60e0-4a5e-a906-7035ffdc4738","Type":"ContainerStarted","Data":"e26d9af75815d1c54771491d732a8a12ea7be234b92fe8cfa848bfcd64270296"} Apr 28 19:21:11.259783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.259748 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:11.266252 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.266222 2539 scope.go:117] "RemoveContainer" 
containerID="e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff" Apr 28 19:21:11.266548 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:21:11.266523 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff\": container with ID starting with e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff not found: ID does not exist" containerID="e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff" Apr 28 19:21:11.266641 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.266557 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff"} err="failed to get container status \"e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff\": rpc error: code = NotFound desc = could not find container \"e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff\": container with ID starting with e2c75ab40ed3ae8c55416ed77f19b8b6fd5d58370ea424402fb375fb7016e8ff not found: ID does not exist" Apr 28 19:21:11.292478 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.292435 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" podStartSLOduration=0.943720111 podStartE2EDuration="1.292421467s" podCreationTimestamp="2026-04-28 19:21:10 +0000 UTC" firstStartedPulling="2026-04-28 19:21:10.718933997 +0000 UTC m=+297.015289338" lastFinishedPulling="2026-04-28 19:21:11.067635353 +0000 UTC m=+297.363990694" observedRunningTime="2026-04-28 19:21:11.287187355 +0000 UTC m=+297.583542719" watchObservedRunningTime="2026-04-28 19:21:11.292421467 +0000 UTC m=+297.588776831" Apr 28 19:21:11.322598 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.322527 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/kserve-controller-manager-b85c69797-8n7bc"] Apr 28 19:21:11.327938 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:11.327904 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-8n7bc"] Apr 28 19:21:12.271257 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:12.271217 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41937fc5-7383-45cb-8790-7336565e98c3" path="/var/lib/kubelet/pods/41937fc5-7383-45cb-8790-7336565e98c3/volumes" Apr 28 19:21:14.179761 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:14.179730 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:21:14.181068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:14.181044 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:21:14.186974 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:14.186957 2539 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:21:42.270990 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:42.270954 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-jx7z7" Apr 28 19:21:43.182072 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.182033 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-2hwb9"] Apr 28 19:21:43.182459 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.182440 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41937fc5-7383-45cb-8790-7336565e98c3" containerName="manager" Apr 28 19:21:43.182459 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.182459 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="41937fc5-7383-45cb-8790-7336565e98c3" containerName="manager" Apr 28 
19:21:43.182620 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.182521 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="41937fc5-7383-45cb-8790-7336565e98c3" containerName="manager" Apr 28 19:21:43.184197 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.184178 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.186452 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.186435 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 28 19:21:43.186691 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.186673 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-mmrwt\"" Apr 28 19:21:43.194853 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.194832 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2hwb9"] Apr 28 19:21:43.243039 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.243004 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hsj\" (UniqueName: \"kubernetes.io/projected/1bf14349-ef4e-45fd-bb5d-ce0795206852-kube-api-access-t7hsj\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.243224 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.243067 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bf14349-ef4e-45fd-bb5d-ce0795206852-cert\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.344059 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.344023 2539 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hsj\" (UniqueName: \"kubernetes.io/projected/1bf14349-ef4e-45fd-bb5d-ce0795206852-kube-api-access-t7hsj\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.344559 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.344093 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bf14349-ef4e-45fd-bb5d-ce0795206852-cert\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.344559 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:21:43.344207 2539 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 28 19:21:43.344559 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:21:43.344290 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bf14349-ef4e-45fd-bb5d-ce0795206852-cert podName:1bf14349-ef4e-45fd-bb5d-ce0795206852 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:43.844266376 +0000 UTC m=+330.140621723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bf14349-ef4e-45fd-bb5d-ce0795206852-cert") pod "odh-model-controller-696fc77849-2hwb9" (UID: "1bf14349-ef4e-45fd-bb5d-ce0795206852") : secret "odh-model-controller-webhook-cert" not found Apr 28 19:21:43.352874 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.352839 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hsj\" (UniqueName: \"kubernetes.io/projected/1bf14349-ef4e-45fd-bb5d-ce0795206852-kube-api-access-t7hsj\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.847815 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.847779 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bf14349-ef4e-45fd-bb5d-ce0795206852-cert\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:43.850110 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:43.850079 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bf14349-ef4e-45fd-bb5d-ce0795206852-cert\") pod \"odh-model-controller-696fc77849-2hwb9\" (UID: \"1bf14349-ef4e-45fd-bb5d-ce0795206852\") " pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:44.094458 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:44.094421 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:44.232422 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:44.232387 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2hwb9"] Apr 28 19:21:44.235916 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:21:44.235886 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf14349_ef4e_45fd_bb5d_ce0795206852.slice/crio-43ad9733de134899b927318e37134e1632622293e048c43c3390cea27b4b2086 WatchSource:0}: Error finding container 43ad9733de134899b927318e37134e1632622293e048c43c3390cea27b4b2086: Status 404 returned error can't find the container with id 43ad9733de134899b927318e37134e1632622293e048c43c3390cea27b4b2086 Apr 28 19:21:44.237197 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:44.237180 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:21:44.344337 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:44.344294 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2hwb9" event={"ID":"1bf14349-ef4e-45fd-bb5d-ce0795206852","Type":"ContainerStarted","Data":"43ad9733de134899b927318e37134e1632622293e048c43c3390cea27b4b2086"} Apr 28 19:21:48.356787 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:48.356751 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2hwb9" event={"ID":"1bf14349-ef4e-45fd-bb5d-ce0795206852","Type":"ContainerStarted","Data":"dcfd1952f2749c26bb55361aa521d03c997b84938c9d4f6bd85c9dc85468b922"} Apr 28 19:21:48.357194 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:48.356874 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:21:48.376067 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:48.376013 2539 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-2hwb9" podStartSLOduration=2.270842291 podStartE2EDuration="5.376000506s" podCreationTimestamp="2026-04-28 19:21:43 +0000 UTC" firstStartedPulling="2026-04-28 19:21:44.237338761 +0000 UTC m=+330.533694107" lastFinishedPulling="2026-04-28 19:21:47.342496976 +0000 UTC m=+333.638852322" observedRunningTime="2026-04-28 19:21:48.375020295 +0000 UTC m=+334.671375660" watchObservedRunningTime="2026-04-28 19:21:48.376000506 +0000 UTC m=+334.672355870" Apr 28 19:21:59.362081 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:21:59.362000 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-2hwb9" Apr 28 19:22:11.981264 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:11.981228 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk"] Apr 28 19:22:11.983248 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:11.983231 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:11.985770 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:11.985742 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 28 19:22:11.985963 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:11.985949 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hdspq\"" Apr 28 19:22:11.991213 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:11.991190 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk"] Apr 28 19:22:12.075366 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.075328 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmxn\" (UniqueName: \"kubernetes.io/projected/9cc19523-7920-4384-91e1-a7f9992d0436-kube-api-access-sfmxn\") pod \"seaweedfs-tls-custom-ddd4dbfd-5w5xk\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.075554 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.075414 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9cc19523-7920-4384-91e1-a7f9992d0436-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-5w5xk\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.176027 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.175984 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmxn\" (UniqueName: \"kubernetes.io/projected/9cc19523-7920-4384-91e1-a7f9992d0436-kube-api-access-sfmxn\") pod \"seaweedfs-tls-custom-ddd4dbfd-5w5xk\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.176162 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.176051 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9cc19523-7920-4384-91e1-a7f9992d0436-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-5w5xk\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.176484 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.176463 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9cc19523-7920-4384-91e1-a7f9992d0436-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-5w5xk\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.188849 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.188807 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmxn\" (UniqueName: \"kubernetes.io/projected/9cc19523-7920-4384-91e1-a7f9992d0436-kube-api-access-sfmxn\") pod \"seaweedfs-tls-custom-ddd4dbfd-5w5xk\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.293597 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.293563 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:12.422195 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:12.422164 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk"] Apr 28 19:22:12.425578 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:22:12.425546 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc19523_7920_4384_91e1_a7f9992d0436.slice/crio-8b3bfb245a4ea94f33288df1d2091ab4c8044d6ef00b7cf5b7c98a6433835263 WatchSource:0}: Error finding container 8b3bfb245a4ea94f33288df1d2091ab4c8044d6ef00b7cf5b7c98a6433835263: Status 404 returned error can't find the container with id 8b3bfb245a4ea94f33288df1d2091ab4c8044d6ef00b7cf5b7c98a6433835263 Apr 28 19:22:13.420004 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:13.419971 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" event={"ID":"9cc19523-7920-4384-91e1-a7f9992d0436","Type":"ContainerStarted","Data":"8b3bfb245a4ea94f33288df1d2091ab4c8044d6ef00b7cf5b7c98a6433835263"} Apr 28 19:22:15.595078 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:15.595055 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 28 19:22:16.430128 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:16.430090 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" event={"ID":"9cc19523-7920-4384-91e1-a7f9992d0436","Type":"ContainerStarted","Data":"11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1"} Apr 28 19:22:16.461245 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:16.461192 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" podStartSLOduration=2.295321332 podStartE2EDuration="5.461176606s" 
podCreationTimestamp="2026-04-28 19:22:11 +0000 UTC" firstStartedPulling="2026-04-28 19:22:12.426685587 +0000 UTC m=+358.723040930" lastFinishedPulling="2026-04-28 19:22:15.592540863 +0000 UTC m=+361.888896204" observedRunningTime="2026-04-28 19:22:16.460203061 +0000 UTC m=+362.756558424" watchObservedRunningTime="2026-04-28 19:22:16.461176606 +0000 UTC m=+362.757532009" Apr 28 19:22:17.968571 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:17.968532 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk"] Apr 28 19:22:18.435919 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:18.435877 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" podUID="9cc19523-7920-4384-91e1-a7f9992d0436" containerName="seaweedfs-tls-custom" containerID="cri-o://11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1" gracePeriod=30 Apr 28 19:22:46.272826 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.272803 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:46.344236 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.344199 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9cc19523-7920-4384-91e1-a7f9992d0436-data\") pod \"9cc19523-7920-4384-91e1-a7f9992d0436\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " Apr 28 19:22:46.344428 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.344303 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfmxn\" (UniqueName: \"kubernetes.io/projected/9cc19523-7920-4384-91e1-a7f9992d0436-kube-api-access-sfmxn\") pod \"9cc19523-7920-4384-91e1-a7f9992d0436\" (UID: \"9cc19523-7920-4384-91e1-a7f9992d0436\") " Apr 28 19:22:46.345546 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.345523 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc19523-7920-4384-91e1-a7f9992d0436-data" (OuterVolumeSpecName: "data") pod "9cc19523-7920-4384-91e1-a7f9992d0436" (UID: "9cc19523-7920-4384-91e1-a7f9992d0436"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:22:46.346292 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.346272 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc19523-7920-4384-91e1-a7f9992d0436-kube-api-access-sfmxn" (OuterVolumeSpecName: "kube-api-access-sfmxn") pod "9cc19523-7920-4384-91e1-a7f9992d0436" (UID: "9cc19523-7920-4384-91e1-a7f9992d0436"). InnerVolumeSpecName "kube-api-access-sfmxn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:22:46.445026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.444935 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfmxn\" (UniqueName: \"kubernetes.io/projected/9cc19523-7920-4384-91e1-a7f9992d0436-kube-api-access-sfmxn\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:22:46.445026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.444970 2539 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9cc19523-7920-4384-91e1-a7f9992d0436-data\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:22:46.506420 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.506358 2539 generic.go:358] "Generic (PLEG): container finished" podID="9cc19523-7920-4384-91e1-a7f9992d0436" containerID="11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1" exitCode=0 Apr 28 19:22:46.506420 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.506405 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" event={"ID":"9cc19523-7920-4384-91e1-a7f9992d0436","Type":"ContainerDied","Data":"11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1"} Apr 28 19:22:46.506650 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.506450 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" Apr 28 19:22:46.506650 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.506453 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk" event={"ID":"9cc19523-7920-4384-91e1-a7f9992d0436","Type":"ContainerDied","Data":"8b3bfb245a4ea94f33288df1d2091ab4c8044d6ef00b7cf5b7c98a6433835263"} Apr 28 19:22:46.506650 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.506474 2539 scope.go:117] "RemoveContainer" containerID="11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1" Apr 28 19:22:46.515516 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.515504 2539 scope.go:117] "RemoveContainer" containerID="11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1" Apr 28 19:22:46.515766 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:22:46.515747 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1\": container with ID starting with 11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1 not found: ID does not exist" containerID="11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1" Apr 28 19:22:46.515819 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.515774 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1"} err="failed to get container status \"11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1\": rpc error: code = NotFound desc = could not find container \"11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1\": container with ID starting with 11b6dc88d2005778b863f4b401798f0f8c891e388af6bf14d7bcd6b4071134e1 not found: ID does not exist" Apr 28 19:22:46.527203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.527176 2539 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk"] Apr 28 19:22:46.531101 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.531081 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5w5xk"] Apr 28 19:22:46.560083 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.560060 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v"] Apr 28 19:22:46.560360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.560344 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cc19523-7920-4384-91e1-a7f9992d0436" containerName="seaweedfs-tls-custom" Apr 28 19:22:46.560416 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.560361 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc19523-7920-4384-91e1-a7f9992d0436" containerName="seaweedfs-tls-custom" Apr 28 19:22:46.560450 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.560420 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cc19523-7920-4384-91e1-a7f9992d0436" containerName="seaweedfs-tls-custom" Apr 28 19:22:46.567294 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.567270 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.569783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.569761 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 28 19:22:46.569783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.569761 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 28 19:22:46.569958 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.569765 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hdspq\"" Apr 28 19:22:46.570552 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.570529 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v"] Apr 28 19:22:46.646677 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.646635 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmd9j\" (UniqueName: \"kubernetes.io/projected/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-kube-api-access-dmd9j\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.646677 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.646678 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.646916 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.646705 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.748168 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.748139 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmd9j\" (UniqueName: \"kubernetes.io/projected/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-kube-api-access-dmd9j\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.748360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.748182 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.748360 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.748223 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.748655 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.748633 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.750718 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.750693 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.756879 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.756853 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmd9j\" (UniqueName: \"kubernetes.io/projected/0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe-kube-api-access-dmd9j\") pod \"seaweedfs-tls-custom-5c88b85bb7-6tn7v\" (UID: \"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.876906 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.876870 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" Apr 28 19:22:46.997977 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:46.997953 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v"] Apr 28 19:22:47.000168 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:22:47.000142 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe3fc8e_4148_4958_9bc0_c0b1fb3fa5fe.slice/crio-0615a726983c218513bdba3037361321b67651efa9df45fc8f75fa1079172d02 WatchSource:0}: Error finding container 0615a726983c218513bdba3037361321b67651efa9df45fc8f75fa1079172d02: Status 404 returned error can't find the container with id 0615a726983c218513bdba3037361321b67651efa9df45fc8f75fa1079172d02 Apr 28 19:22:47.510899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:47.510860 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" 
event={"ID":"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe","Type":"ContainerStarted","Data":"4b88eb9e7f88b85b86d9aa0c47175dafa00fbc1e3017a63147650fa343abda5b"} Apr 28 19:22:47.510899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:47.510900 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" event={"ID":"0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe","Type":"ContainerStarted","Data":"0615a726983c218513bdba3037361321b67651efa9df45fc8f75fa1079172d02"} Apr 28 19:22:47.527362 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:47.527314 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6tn7v" podStartSLOduration=1.247551371 podStartE2EDuration="1.527297719s" podCreationTimestamp="2026-04-28 19:22:46 +0000 UTC" firstStartedPulling="2026-04-28 19:22:47.001580292 +0000 UTC m=+393.297935634" lastFinishedPulling="2026-04-28 19:22:47.281326635 +0000 UTC m=+393.577681982" observedRunningTime="2026-04-28 19:22:47.525683717 +0000 UTC m=+393.822039081" watchObservedRunningTime="2026-04-28 19:22:47.527297719 +0000 UTC m=+393.823653083" Apr 28 19:22:48.270751 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:48.270718 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc19523-7920-4384-91e1-a7f9992d0436" path="/var/lib/kubelet/pods/9cc19523-7920-4384-91e1-a7f9992d0436/volumes" Apr 28 19:22:55.920070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:55.920027 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fk69z"] Apr 28 19:22:55.923439 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:55.923417 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:55.925497 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:55.925479 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 28 19:22:55.925575 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:55.925477 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 28 19:22:55.930677 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:55.930653 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fk69z"] Apr 28 19:22:56.014104 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.014065 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.014104 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.014109 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7xw\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-kube-api-access-8j7xw\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.014317 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.014131 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d08931bf-78f5-41dc-906f-62525abfa8ce-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.114570 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.114537 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.114739 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.114578 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j7xw\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-kube-api-access-8j7xw\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.114739 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.114606 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d08931bf-78f5-41dc-906f-62525abfa8ce-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.114739 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:22:56.114703 2539 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 28 19:22:56.114739 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:22:56.114728 2539 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-fk69z: secret "seaweedfs-tls-serving" not found Apr 28 19:22:56.114930 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:22:56.114798 2539 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-seaweedfs-tls-serving podName:d08931bf-78f5-41dc-906f-62525abfa8ce nodeName:}" failed. No retries permitted until 2026-04-28 19:22:56.614776492 +0000 UTC m=+402.911131846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-fk69z" (UID: "d08931bf-78f5-41dc-906f-62525abfa8ce") : secret "seaweedfs-tls-serving" not found Apr 28 19:22:56.114991 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.114961 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d08931bf-78f5-41dc-906f-62525abfa8ce-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.125649 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.125621 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j7xw\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-kube-api-access-8j7xw\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.619661 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.619626 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.621912 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.621885 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/d08931bf-78f5-41dc-906f-62525abfa8ce-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fk69z\" (UID: \"d08931bf-78f5-41dc-906f-62525abfa8ce\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.832850 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.832800 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" Apr 28 19:22:56.948747 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:56.948617 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fk69z"] Apr 28 19:22:56.951505 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:22:56.951475 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd08931bf_78f5_41dc_906f_62525abfa8ce.slice/crio-440021e9fe94498d3c60009d2e0ef86edecc89b972801a2071745c93e49cf816 WatchSource:0}: Error finding container 440021e9fe94498d3c60009d2e0ef86edecc89b972801a2071745c93e49cf816: Status 404 returned error can't find the container with id 440021e9fe94498d3c60009d2e0ef86edecc89b972801a2071745c93e49cf816 Apr 28 19:22:57.538179 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:57.538139 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" event={"ID":"d08931bf-78f5-41dc-906f-62525abfa8ce","Type":"ContainerStarted","Data":"2cb4fb611f9a2327ad18065e9d96cd668e7a9aed32ab5073d69317dbcdc5e7ff"} Apr 28 19:22:57.538179 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:57.538182 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" event={"ID":"d08931bf-78f5-41dc-906f-62525abfa8ce","Type":"ContainerStarted","Data":"440021e9fe94498d3c60009d2e0ef86edecc89b972801a2071745c93e49cf816"} Apr 28 19:22:57.555534 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:57.555489 
2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fk69z" podStartSLOduration=2.308786997 podStartE2EDuration="2.555473476s" podCreationTimestamp="2026-04-28 19:22:55 +0000 UTC" firstStartedPulling="2026-04-28 19:22:56.952575471 +0000 UTC m=+403.248930813" lastFinishedPulling="2026-04-28 19:22:57.199261946 +0000 UTC m=+403.495617292" observedRunningTime="2026-04-28 19:22:57.555169474 +0000 UTC m=+403.851524838" watchObservedRunningTime="2026-04-28 19:22:57.555473476 +0000 UTC m=+403.851828842" Apr 28 19:22:58.096512 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.096475 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-5t9jq"] Apr 28 19:22:58.101059 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.101034 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:22:58.111628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.111600 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-5t9jq"] Apr 28 19:22:58.133625 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.133598 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbk6n\" (UniqueName: \"kubernetes.io/projected/71704a35-fdd3-4b64-a758-2b24c286270e-kube-api-access-pbk6n\") pod \"s3-tls-init-serving-5t9jq\" (UID: \"71704a35-fdd3-4b64-a758-2b24c286270e\") " pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:22:58.235026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.234972 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbk6n\" (UniqueName: \"kubernetes.io/projected/71704a35-fdd3-4b64-a758-2b24c286270e-kube-api-access-pbk6n\") pod \"s3-tls-init-serving-5t9jq\" (UID: \"71704a35-fdd3-4b64-a758-2b24c286270e\") " pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:22:58.243467 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:22:58.243438 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbk6n\" (UniqueName: \"kubernetes.io/projected/71704a35-fdd3-4b64-a758-2b24c286270e-kube-api-access-pbk6n\") pod \"s3-tls-init-serving-5t9jq\" (UID: \"71704a35-fdd3-4b64-a758-2b24c286270e\") " pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:22:58.417850 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.417757 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:22:58.538468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:58.538412 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-5t9jq"] Apr 28 19:22:58.541231 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:22:58.541204 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71704a35_fdd3_4b64_a758_2b24c286270e.slice/crio-2b43df5a63493cf8660cb052780c498557af45ac331e72ce3f03ef2905600c4d WatchSource:0}: Error finding container 2b43df5a63493cf8660cb052780c498557af45ac331e72ce3f03ef2905600c4d: Status 404 returned error can't find the container with id 2b43df5a63493cf8660cb052780c498557af45ac331e72ce3f03ef2905600c4d Apr 28 19:22:59.545632 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:22:59.545592 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-5t9jq" event={"ID":"71704a35-fdd3-4b64-a758-2b24c286270e","Type":"ContainerStarted","Data":"2b43df5a63493cf8660cb052780c498557af45ac331e72ce3f03ef2905600c4d"} Apr 28 19:23:03.558047 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:03.558008 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-5t9jq" event={"ID":"71704a35-fdd3-4b64-a758-2b24c286270e","Type":"ContainerStarted","Data":"2c861d317dc2992e47e474fa84c8a511ac06cdb8dfe6b43702a8c3cbea2ade03"} Apr 28 19:23:03.575310 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:23:03.575240 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-5t9jq" podStartSLOduration=1.064654672 podStartE2EDuration="5.57522037s" podCreationTimestamp="2026-04-28 19:22:58 +0000 UTC" firstStartedPulling="2026-04-28 19:22:58.542863076 +0000 UTC m=+404.839218421" lastFinishedPulling="2026-04-28 19:23:03.053428765 +0000 UTC m=+409.349784119" observedRunningTime="2026-04-28 19:23:03.573277449 +0000 UTC m=+409.869632826" watchObservedRunningTime="2026-04-28 19:23:03.57522037 +0000 UTC m=+409.871575736" Apr 28 19:23:07.569649 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:07.569615 2539 generic.go:358] "Generic (PLEG): container finished" podID="71704a35-fdd3-4b64-a758-2b24c286270e" containerID="2c861d317dc2992e47e474fa84c8a511ac06cdb8dfe6b43702a8c3cbea2ade03" exitCode=0 Apr 28 19:23:07.570047 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:07.569692 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-5t9jq" event={"ID":"71704a35-fdd3-4b64-a758-2b24c286270e","Type":"ContainerDied","Data":"2c861d317dc2992e47e474fa84c8a511ac06cdb8dfe6b43702a8c3cbea2ade03"} Apr 28 19:23:08.715348 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:08.715325 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:23:08.826075 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:08.825969 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbk6n\" (UniqueName: \"kubernetes.io/projected/71704a35-fdd3-4b64-a758-2b24c286270e-kube-api-access-pbk6n\") pod \"71704a35-fdd3-4b64-a758-2b24c286270e\" (UID: \"71704a35-fdd3-4b64-a758-2b24c286270e\") " Apr 28 19:23:08.827992 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:08.827971 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71704a35-fdd3-4b64-a758-2b24c286270e-kube-api-access-pbk6n" (OuterVolumeSpecName: "kube-api-access-pbk6n") pod "71704a35-fdd3-4b64-a758-2b24c286270e" (UID: "71704a35-fdd3-4b64-a758-2b24c286270e"). InnerVolumeSpecName "kube-api-access-pbk6n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:23:08.927263 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:08.927225 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pbk6n\" (UniqueName: \"kubernetes.io/projected/71704a35-fdd3-4b64-a758-2b24c286270e-kube-api-access-pbk6n\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:23:09.576620 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:09.576584 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-5t9jq" event={"ID":"71704a35-fdd3-4b64-a758-2b24c286270e","Type":"ContainerDied","Data":"2b43df5a63493cf8660cb052780c498557af45ac331e72ce3f03ef2905600c4d"} Apr 28 19:23:09.576620 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:09.576616 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-5t9jq" Apr 28 19:23:09.576843 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:23:09.576622 2539 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b43df5a63493cf8660cb052780c498557af45ac331e72ce3f03ef2905600c4d" Apr 28 19:26:14.200754 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:26:14.200727 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:26:14.201279 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:26:14.200966 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:31:14.221504 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:31:14.221423 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:31:14.222061 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:31:14.221651 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:36:14.240538 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:36:14.240500 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:36:14.241331 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:36:14.241309 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:41:14.266201 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:41:14.266170 2539 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:41:14.267453 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:41:14.267429 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:42:16.634025 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.633991 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"] Apr 28 19:42:16.634758 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.634470 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71704a35-fdd3-4b64-a758-2b24c286270e" containerName="s3-tls-init-serving" Apr 28 19:42:16.634758 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.634488 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="71704a35-fdd3-4b64-a758-2b24c286270e" containerName="s3-tls-init-serving" Apr 28 19:42:16.634758 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.634568 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="71704a35-fdd3-4b64-a758-2b24c286270e" containerName="s3-tls-init-serving" Apr 28 19:42:16.638071 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.638051 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.640066 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.640047 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 28 19:42:16.640066 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.640062 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:42:16.640657 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.640634 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 19:42:16.640657 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.640645 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 28 19:42:16.640657 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.640638 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:42:16.648143 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.648116 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.648259 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.648148 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-proxy-tls\") pod 
\"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.648259 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.648177 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.648397 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.648258 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mxm\" (UniqueName: \"kubernetes.io/projected/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kube-api-access-d6mxm\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.648471 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.648450 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"] Apr 28 19:42:16.749092 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.749050 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.749266 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.749109 2539 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.749266 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.749161 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.749266 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.749201 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mxm\" (UniqueName: \"kubernetes.io/projected/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kube-api-access-d6mxm\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.749509 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.749482 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.749868 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.749836 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.751571 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.751551 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.756825 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.756802 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mxm\" (UniqueName: \"kubernetes.io/projected/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kube-api-access-d6mxm\") pod \"isvc-pmml-runtime-predictor-67bc544947-2zztw\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:16.948318 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:16.948236 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:17.065022 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:17.064986 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"] Apr 28 19:42:17.068089 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:42:17.068040 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1725c5e_89b3_42d7_aa6e_1bef5c37035f.slice/crio-d09d7b8c44ae56279a5da6e1bb5b671a781cac00de8684124e5fe3c44ff9dc2a WatchSource:0}: Error finding container d09d7b8c44ae56279a5da6e1bb5b671a781cac00de8684124e5fe3c44ff9dc2a: Status 404 returned error can't find the container with id d09d7b8c44ae56279a5da6e1bb5b671a781cac00de8684124e5fe3c44ff9dc2a Apr 28 19:42:17.069931 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:17.069914 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:42:17.700169 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:17.700134 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerStarted","Data":"d09d7b8c44ae56279a5da6e1bb5b671a781cac00de8684124e5fe3c44ff9dc2a"} Apr 28 19:42:22.717596 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:22.717561 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerStarted","Data":"b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49"} Apr 28 19:42:26.728978 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:26.728936 2539 generic.go:358] "Generic (PLEG): container finished" podID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" 
containerID="b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49" exitCode=0 Apr 28 19:42:26.729344 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:26.729008 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerDied","Data":"b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49"} Apr 28 19:42:34.758648 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:34.758603 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerStarted","Data":"404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893"} Apr 28 19:42:36.765976 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:36.765939 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerStarted","Data":"6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346"} Apr 28 19:42:36.766341 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:36.766125 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:36.784778 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:36.784726 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podStartSLOduration=1.543964764 podStartE2EDuration="20.784711015s" podCreationTimestamp="2026-04-28 19:42:16 +0000 UTC" firstStartedPulling="2026-04-28 19:42:17.070050736 +0000 UTC m=+1563.366406078" lastFinishedPulling="2026-04-28 19:42:36.310796982 +0000 UTC m=+1582.607152329" observedRunningTime="2026-04-28 19:42:36.782559025 +0000 UTC m=+1583.078914399" 
watchObservedRunningTime="2026-04-28 19:42:36.784711015 +0000 UTC m=+1583.081066379" Apr 28 19:42:37.769141 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:37.769109 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:37.770499 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:37.770455 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 28 19:42:38.772087 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:38.772043 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 28 19:42:43.776724 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:43.776684 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" Apr 28 19:42:43.777255 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:43.777229 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 28 19:42:53.777349 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:42:53.777309 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.23:8080: connect: connection refused"
Apr 28 19:43:03.777716 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:43:03.777631 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 28 19:43:13.777537 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:43:13.777497 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 28 19:43:23.777271 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:43:23.777225 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 28 19:43:33.777423 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:43:33.777354 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 28 19:43:43.777429 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:43:43.777368 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 28 19:43:53.777973 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:43:53.777941 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"
Apr 28 19:44:01.964624 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:01.964591 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"]
Apr 28 19:44:01.965048 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:01.965028 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" containerID="cri-o://404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893" gracePeriod=30
Apr 28 19:44:01.965130 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:01.965070 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kube-rbac-proxy" containerID="cri-o://6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346" gracePeriod=30
Apr 28 19:44:02.287777 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.287744 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"]
Apr 28 19:44:02.291223 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.291208 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.293192 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.293170 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\""
Apr 28 19:44:02.293192 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.293186 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 28 19:44:02.300096 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.300072 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"]
Apr 28 19:44:02.364662 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.364629 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad0d63d6-1cda-4932-ada2-21f5e7494017-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.364829 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.364668 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0d63d6-1cda-4932-ada2-21f5e7494017-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.364829 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.364693 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad0d63d6-1cda-4932-ada2-21f5e7494017-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.364829 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.364781 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv7wt\" (UniqueName: \"kubernetes.io/projected/ad0d63d6-1cda-4932-ada2-21f5e7494017-kube-api-access-kv7wt\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.465492 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.465456 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv7wt\" (UniqueName: \"kubernetes.io/projected/ad0d63d6-1cda-4932-ada2-21f5e7494017-kube-api-access-kv7wt\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.465672 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.465564 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad0d63d6-1cda-4932-ada2-21f5e7494017-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.465672 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.465595 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0d63d6-1cda-4932-ada2-21f5e7494017-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.465672 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.465629 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad0d63d6-1cda-4932-ada2-21f5e7494017-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.466024 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.466000 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0d63d6-1cda-4932-ada2-21f5e7494017-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.466316 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.466296 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad0d63d6-1cda-4932-ada2-21f5e7494017-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.467828 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.467810 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad0d63d6-1cda-4932-ada2-21f5e7494017-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.473306 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.473271 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv7wt\" (UniqueName: \"kubernetes.io/projected/ad0d63d6-1cda-4932-ada2-21f5e7494017-kube-api-access-kv7wt\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.602119 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.602028 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:02.726794 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:02.726771 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"]
Apr 28 19:44:02.729059 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:44:02.729032 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad0d63d6_1cda_4932_ada2_21f5e7494017.slice/crio-8224a1a18a8a160b65043e5e918a6636c6a422de282f452123ffa96660e4dce7 WatchSource:0}: Error finding container 8224a1a18a8a160b65043e5e918a6636c6a422de282f452123ffa96660e4dce7: Status 404 returned error can't find the container with id 8224a1a18a8a160b65043e5e918a6636c6a422de282f452123ffa96660e4dce7
Apr 28 19:44:03.012831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:03.012792 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerStarted","Data":"1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4"}
Apr 28 19:44:03.012831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:03.012834 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerStarted","Data":"8224a1a18a8a160b65043e5e918a6636c6a422de282f452123ffa96660e4dce7"}
Apr 28 19:44:03.015104 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:03.015080 2539 generic.go:358] "Generic (PLEG): container finished" podID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerID="6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346" exitCode=2
Apr 28 19:44:03.015222 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:03.015150 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerDied","Data":"6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346"}
Apr 28 19:44:03.772732 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:03.772682 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused"
Apr 28 19:44:03.778240 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:03.778213 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 28 19:44:05.711421 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.711392 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"
Apr 28 19:44:05.796807 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.796762 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-proxy-tls\") pod \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") "
Apr 28 19:44:05.796807 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.796812 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") "
Apr 28 19:44:05.797085 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.796850 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kserve-provision-location\") pod \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") "
Apr 28 19:44:05.797085 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.796903 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6mxm\" (UniqueName: \"kubernetes.io/projected/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kube-api-access-d6mxm\") pod \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\" (UID: \"f1725c5e-89b3-42d7-aa6e-1bef5c37035f\") "
Apr 28 19:44:05.797231 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.797205 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1725c5e-89b3-42d7-aa6e-1bef5c37035f" (UID: "f1725c5e-89b3-42d7-aa6e-1bef5c37035f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:44:05.797288 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.797245 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "f1725c5e-89b3-42d7-aa6e-1bef5c37035f" (UID: "f1725c5e-89b3-42d7-aa6e-1bef5c37035f"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:44:05.799044 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.799021 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f1725c5e-89b3-42d7-aa6e-1bef5c37035f" (UID: "f1725c5e-89b3-42d7-aa6e-1bef5c37035f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:44:05.799177 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.799153 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kube-api-access-d6mxm" (OuterVolumeSpecName: "kube-api-access-d6mxm") pod "f1725c5e-89b3-42d7-aa6e-1bef5c37035f" (UID: "f1725c5e-89b3-42d7-aa6e-1bef5c37035f"). InnerVolumeSpecName "kube-api-access-d6mxm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:44:05.897696 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.897643 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6mxm\" (UniqueName: \"kubernetes.io/projected/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kube-api-access-d6mxm\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:44:05.897696 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.897688 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:44:05.897696 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.897701 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:44:05.897696 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:05.897711 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1725c5e-89b3-42d7-aa6e-1bef5c37035f-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:44:06.024886 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.024850 2539 generic.go:358] "Generic (PLEG): container finished" podID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerID="404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893" exitCode=0
Apr 28 19:44:06.025053 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.024934 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerDied","Data":"404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893"}
Apr 28 19:44:06.025053 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.024960 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw" event={"ID":"f1725c5e-89b3-42d7-aa6e-1bef5c37035f","Type":"ContainerDied","Data":"d09d7b8c44ae56279a5da6e1bb5b671a781cac00de8684124e5fe3c44ff9dc2a"}
Apr 28 19:44:06.025053 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.024974 2539 scope.go:117] "RemoveContainer" containerID="6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346"
Apr 28 19:44:06.025053 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.024937 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"
Apr 28 19:44:06.033332 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.033219 2539 scope.go:117] "RemoveContainer" containerID="404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893"
Apr 28 19:44:06.040625 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.040605 2539 scope.go:117] "RemoveContainer" containerID="b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49"
Apr 28 19:44:06.046967 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.046914 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"]
Apr 28 19:44:06.048449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.048432 2539 scope.go:117] "RemoveContainer" containerID="6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346"
Apr 28 19:44:06.048783 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:44:06.048748 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346\": container with ID starting with 6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346 not found: ID does not exist" containerID="6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346"
Apr 28 19:44:06.048878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.048787 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346"} err="failed to get container status \"6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346\": rpc error: code = NotFound desc = could not find container \"6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346\": container with ID starting with 6847d3fc36195db4b15f2db8c47d673331624d449d4e3b9a12753b3468609346 not found: ID does not exist"
Apr 28 19:44:06.048878 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.048815 2539 scope.go:117] "RemoveContainer" containerID="404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893"
Apr 28 19:44:06.049084 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:44:06.049066 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893\": container with ID starting with 404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893 not found: ID does not exist" containerID="404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893"
Apr 28 19:44:06.049140 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.049089 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893"} err="failed to get container status \"404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893\": rpc error: code = NotFound desc = could not find container \"404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893\": container with ID starting with 404a539533294982aea538e5fef55d0375654319c0eaea551b0b5e9b184bf893 not found: ID does not exist"
Apr 28 19:44:06.049140 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.049105 2539 scope.go:117] "RemoveContainer" containerID="b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49"
Apr 28 19:44:06.049306 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:44:06.049288 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49\": container with ID starting with b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49 not found: ID does not exist" containerID="b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49"
Apr 28 19:44:06.049343 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.049310 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49"} err="failed to get container status \"b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49\": rpc error: code = NotFound desc = could not find container \"b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49\": container with ID starting with b7c58eaf95cfed6ac6c2e301ba266f80c429421742539f13b1367da822a60a49 not found: ID does not exist"
Apr 28 19:44:06.052323 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.052301 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-2zztw"]
Apr 28 19:44:06.271846 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:06.271806 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" path="/var/lib/kubelet/pods/f1725c5e-89b3-42d7-aa6e-1bef5c37035f/volumes"
Apr 28 19:44:07.029052 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:07.029022 2539 generic.go:358] "Generic (PLEG): container finished" podID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerID="1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4" exitCode=0
Apr 28 19:44:07.029561 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:07.029099 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerDied","Data":"1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4"}
Apr 28 19:44:08.034881 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:08.034845 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerStarted","Data":"ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c"}
Apr 28 19:44:08.035243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:08.034894 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerStarted","Data":"10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d"}
Apr 28 19:44:08.035243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:08.035126 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:08.035243 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:08.035154 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:08.036422 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:08.036395 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:44:08.054815 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:08.054767 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podStartSLOduration=6.054752028 podStartE2EDuration="6.054752028s" podCreationTimestamp="2026-04-28 19:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:44:08.05318168 +0000 UTC m=+1674.349537050" watchObservedRunningTime="2026-04-28 19:44:08.054752028 +0000 UTC m=+1674.351107391"
Apr 28 19:44:09.042236 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:09.042110 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:44:14.044025 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:14.043996 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:44:14.044556 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:14.044532 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:44:24.044937 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:24.044889 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:44:34.044687 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:34.044645 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:44:44.045434 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:44.045392 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:44:54.044477 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:44:54.044436 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:45:04.044945 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:04.044897 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:45:14.045140 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:14.045091 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:45:24.045061 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:24.045015 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 28 19:45:27.268039 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:27.268011 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:45:33.269835 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:33.269803 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"]
Apr 28 19:45:33.270280 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:33.270138 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" containerID="cri-o://10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d" gracePeriod=30
Apr 28 19:45:33.270280 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:33.270182 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kube-rbac-proxy" containerID="cri-o://ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c" gracePeriod=30
Apr 28 19:45:34.040420 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:34.040351 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused"
Apr 28 19:45:34.283514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:34.283481 2539 generic.go:358] "Generic (PLEG): container finished" podID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerID="ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c" exitCode=2
Apr 28 19:45:34.283899 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:34.283558 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerDied","Data":"ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c"}
Apr 28 19:45:37.013080 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.013057 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"
Apr 28 19:45:37.147810 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.147709 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad0d63d6-1cda-4932-ada2-21f5e7494017-proxy-tls\") pod \"ad0d63d6-1cda-4932-ada2-21f5e7494017\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") "
Apr 28 19:45:37.147810 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.147755 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad0d63d6-1cda-4932-ada2-21f5e7494017-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"ad0d63d6-1cda-4932-ada2-21f5e7494017\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") "
Apr 28 19:45:37.147810 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.147782 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv7wt\" (UniqueName: \"kubernetes.io/projected/ad0d63d6-1cda-4932-ada2-21f5e7494017-kube-api-access-kv7wt\") pod \"ad0d63d6-1cda-4932-ada2-21f5e7494017\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") "
Apr 28 19:45:37.148130 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.147822 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0d63d6-1cda-4932-ada2-21f5e7494017-kserve-provision-location\") pod \"ad0d63d6-1cda-4932-ada2-21f5e7494017\" (UID: \"ad0d63d6-1cda-4932-ada2-21f5e7494017\") "
Apr 28 19:45:37.148183 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.148150 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad0d63d6-1cda-4932-ada2-21f5e7494017-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad0d63d6-1cda-4932-ada2-21f5e7494017" (UID: "ad0d63d6-1cda-4932-ada2-21f5e7494017"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:45:37.148253 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.148225 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0d63d6-1cda-4932-ada2-21f5e7494017-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "ad0d63d6-1cda-4932-ada2-21f5e7494017" (UID: "ad0d63d6-1cda-4932-ada2-21f5e7494017"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:45:37.149895 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.149871 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0d63d6-1cda-4932-ada2-21f5e7494017-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ad0d63d6-1cda-4932-ada2-21f5e7494017" (UID: "ad0d63d6-1cda-4932-ada2-21f5e7494017"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:45:37.150005 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.149899 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0d63d6-1cda-4932-ada2-21f5e7494017-kube-api-access-kv7wt" (OuterVolumeSpecName: "kube-api-access-kv7wt") pod "ad0d63d6-1cda-4932-ada2-21f5e7494017" (UID: "ad0d63d6-1cda-4932-ada2-21f5e7494017"). InnerVolumeSpecName "kube-api-access-kv7wt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:45:37.248618 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.248591 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad0d63d6-1cda-4932-ada2-21f5e7494017-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:45:37.248618 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.248615 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad0d63d6-1cda-4932-ada2-21f5e7494017-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:45:37.248618 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.248626 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kv7wt\" (UniqueName: \"kubernetes.io/projected/ad0d63d6-1cda-4932-ada2-21f5e7494017-kube-api-access-kv7wt\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:45:37.248831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.248635 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0d63d6-1cda-4932-ada2-21f5e7494017-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:45:37.292130 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.292099 2539 generic.go:358] "Generic (PLEG): container finished" podID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerID="10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d" exitCode=0
Apr 28 19:45:37.292276 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.292138 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerDied","Data":"10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d"}
Apr 28 19:45:37.292276 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.292161 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" event={"ID":"ad0d63d6-1cda-4932-ada2-21f5e7494017","Type":"ContainerDied","Data":"8224a1a18a8a160b65043e5e918a6636c6a422de282f452123ffa96660e4dce7"}
Apr 28 19:45:37.292276 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.292170 2539 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9" Apr 28 19:45:37.292407 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.292175 2539 scope.go:117] "RemoveContainer" containerID="ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c" Apr 28 19:45:37.299904 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.299797 2539 scope.go:117] "RemoveContainer" containerID="10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d" Apr 28 19:45:37.306678 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.306660 2539 scope.go:117] "RemoveContainer" containerID="1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4" Apr 28 19:45:37.312586 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.312563 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"] Apr 28 19:45:37.314554 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.314533 2539 scope.go:117] "RemoveContainer" containerID="ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c" Apr 28 19:45:37.314893 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:45:37.314867 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c\": container with ID starting with ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c not found: ID does not exist" containerID="ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c" Apr 28 19:45:37.315046 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.314922 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c"} err="failed to get container status \"ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c\": rpc error: code = NotFound desc = could not find container 
\"ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c\": container with ID starting with ac1addc47d7799f06820933004eca4310a3352437e57393a440380560def398c not found: ID does not exist" Apr 28 19:45:37.315046 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.314957 2539 scope.go:117] "RemoveContainer" containerID="10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d" Apr 28 19:45:37.315736 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:45:37.315717 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d\": container with ID starting with 10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d not found: ID does not exist" containerID="10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d" Apr 28 19:45:37.315822 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.315742 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d"} err="failed to get container status \"10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d\": rpc error: code = NotFound desc = could not find container \"10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d\": container with ID starting with 10c311c38b8cd63b949d51930ceb2076c32cc2da75615f7828c11604e8d4d81d not found: ID does not exist" Apr 28 19:45:37.315822 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.315758 2539 scope.go:117] "RemoveContainer" containerID="1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4" Apr 28 19:45:37.316052 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:45:37.316030 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4\": container with ID starting with 
1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4 not found: ID does not exist" containerID="1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4" Apr 28 19:45:37.316094 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.316062 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4"} err="failed to get container status \"1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4\": rpc error: code = NotFound desc = could not find container \"1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4\": container with ID starting with 1a8332da7b4a5ef10c9d8639a8ea9effdeeb5c0f26e060ef666cf326558b52d4 not found: ID does not exist" Apr 28 19:45:37.316513 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:37.316495 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-cx7b9"] Apr 28 19:45:38.274978 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:45:38.274938 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" path="/var/lib/kubelet/pods/ad0d63d6-1cda-4932-ada2-21f5e7494017/volumes" Apr 28 19:46:14.286749 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:46:14.286717 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:46:14.288965 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:46:14.288944 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:47:19.159474 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159436 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w"] Apr 28 19:47:19.159891 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159761 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="storage-initializer" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159775 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="storage-initializer" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159793 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kube-rbac-proxy" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159798 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kube-rbac-proxy" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159806 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159812 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159821 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kube-rbac-proxy" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159827 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kube-rbac-proxy" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159837 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" 
containerName="kserve-container" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159846 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159857 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="storage-initializer" Apr 28 19:47:19.159891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159864 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="storage-initializer" Apr 28 19:47:19.160251 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159917 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kube-rbac-proxy" Apr 28 19:47:19.160251 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159927 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kube-rbac-proxy" Apr 28 19:47:19.160251 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159933 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad0d63d6-1cda-4932-ada2-21f5e7494017" containerName="kserve-container" Apr 28 19:47:19.160251 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.159939 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1725c5e-89b3-42d7-aa6e-1bef5c37035f" containerName="kserve-container" Apr 28 19:47:19.162972 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.162949 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.164978 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.164957 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:47:19.165114 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.165091 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 19:47:19.165413 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.165118 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:47:19.165413 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.165137 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 28 19:47:19.165413 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.165163 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 28 19:47:19.172562 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.172539 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w"] Apr 28 19:47:19.250198 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.250163 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdtvb\" (UniqueName: \"kubernetes.io/projected/4fb0fd05-264d-4768-a6b8-c1f03d923740-kube-api-access-wdtvb\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.250357 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.250210 
2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb0fd05-264d-4768-a6b8-c1f03d923740-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.250357 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.250229 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4fb0fd05-264d-4768-a6b8-c1f03d923740-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.250357 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.250301 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4fb0fd05-264d-4768-a6b8-c1f03d923740-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.351632 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.351601 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4fb0fd05-264d-4768-a6b8-c1f03d923740-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.351814 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.351692 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdtvb\" (UniqueName: \"kubernetes.io/projected/4fb0fd05-264d-4768-a6b8-c1f03d923740-kube-api-access-wdtvb\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.351814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.351739 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb0fd05-264d-4768-a6b8-c1f03d923740-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.351814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.351769 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4fb0fd05-264d-4768-a6b8-c1f03d923740-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.352127 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.352096 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4fb0fd05-264d-4768-a6b8-c1f03d923740-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.352313 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.352296 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4fb0fd05-264d-4768-a6b8-c1f03d923740-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.354232 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.354214 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb0fd05-264d-4768-a6b8-c1f03d923740-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.358784 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.358765 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdtvb\" (UniqueName: \"kubernetes.io/projected/4fb0fd05-264d-4768-a6b8-c1f03d923740-kube-api-access-wdtvb\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-54b7w\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.474367 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.474275 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:19.598674 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.598640 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w"] Apr 28 19:47:19.601735 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:47:19.601709 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb0fd05_264d_4768_a6b8_c1f03d923740.slice/crio-d21388e9a0f32e9e494cb1473116a867d786bc15ec368880f2df996c40931d89 WatchSource:0}: Error finding container d21388e9a0f32e9e494cb1473116a867d786bc15ec368880f2df996c40931d89: Status 404 returned error can't find the container with id d21388e9a0f32e9e494cb1473116a867d786bc15ec368880f2df996c40931d89 Apr 28 19:47:19.603508 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:19.603489 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:47:20.578779 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:20.578739 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerStarted","Data":"5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3"} Apr 28 19:47:20.578779 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:20.578781 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerStarted","Data":"d21388e9a0f32e9e494cb1473116a867d786bc15ec368880f2df996c40931d89"} Apr 28 19:47:23.588474 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:23.588439 2539 generic.go:358] "Generic (PLEG): container finished" podID="4fb0fd05-264d-4768-a6b8-c1f03d923740" 
containerID="5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3" exitCode=0 Apr 28 19:47:23.588849 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:23.588510 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerDied","Data":"5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3"} Apr 28 19:47:47.669531 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:47.669499 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerStarted","Data":"e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988"} Apr 28 19:47:48.673898 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:48.673864 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerStarted","Data":"8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798"} Apr 28 19:47:48.674269 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:48.674088 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:48.674269 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:48.674243 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:48.675271 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:48.675246 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: 
connect: connection refused" Apr 28 19:47:48.696019 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:48.695968 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podStartSLOduration=5.789147312 podStartE2EDuration="29.695953994s" podCreationTimestamp="2026-04-28 19:47:19 +0000 UTC" firstStartedPulling="2026-04-28 19:47:23.589697765 +0000 UTC m=+1869.886053106" lastFinishedPulling="2026-04-28 19:47:47.496504445 +0000 UTC m=+1893.792859788" observedRunningTime="2026-04-28 19:47:48.693703957 +0000 UTC m=+1894.990059320" watchObservedRunningTime="2026-04-28 19:47:48.695953994 +0000 UTC m=+1894.992309358" Apr 28 19:47:49.677202 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:49.677166 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:47:54.682333 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:54.682292 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:47:54.683064 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:47:54.683034 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:48:04.683332 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:48:04.683287 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:48:14.684025 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:48:14.683971 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:48:24.683886 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:48:24.683839 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:48:34.683531 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:48:34.683482 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:48:44.683395 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:48:44.683326 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:48:54.683806 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:48:54.683712 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 
19:49:04.683528 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:04.683499 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:49:13.330916 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:13.330879 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w"] Apr 28 19:49:13.331342 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:13.331249 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" containerID="cri-o://e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988" gracePeriod=30 Apr 28 19:49:13.331438 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:13.331308 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kube-rbac-proxy" containerID="cri-o://8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798" gracePeriod=30 Apr 28 19:49:13.913319 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:13.913285 2539 generic.go:358] "Generic (PLEG): container finished" podID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerID="8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798" exitCode=2 Apr 28 19:49:13.913521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:13.913360 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerDied","Data":"8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798"} Apr 28 19:49:14.146261 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.146222 2539 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl"] Apr 28 19:49:14.149552 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.149534 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.153315 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.153293 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 28 19:49:14.153625 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.153610 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 28 19:49:14.166797 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.166739 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl"] Apr 28 19:49:14.254475 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.254444 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f904b7bc-ee4c-4053-8382-8843663f8ec6-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.254638 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.254506 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904b7bc-ee4c-4053-8382-8843663f8ec6-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: 
\"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.254638 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.254531 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdj79\" (UniqueName: \"kubernetes.io/projected/f904b7bc-ee4c-4053-8382-8843663f8ec6-kube-api-access-mdj79\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.254638 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.254613 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904b7bc-ee4c-4053-8382-8843663f8ec6-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.355165 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.355117 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f904b7bc-ee4c-4053-8382-8843663f8ec6-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.355630 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.355190 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904b7bc-ee4c-4053-8382-8843663f8ec6-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: 
\"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.355630 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.355216 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdj79\" (UniqueName: \"kubernetes.io/projected/f904b7bc-ee4c-4053-8382-8843663f8ec6-kube-api-access-mdj79\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.355630 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.355309 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904b7bc-ee4c-4053-8382-8843663f8ec6-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.355630 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.355588 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f904b7bc-ee4c-4053-8382-8843663f8ec6-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.357561 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.357533 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 28 19:49:14.357733 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.357721 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 
28 19:49:14.363245 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.363221 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdj79\" (UniqueName: \"kubernetes.io/projected/f904b7bc-ee4c-4053-8382-8843663f8ec6-kube-api-access-mdj79\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.366436 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.366415 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904b7bc-ee4c-4053-8382-8843663f8ec6-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.368270 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.368245 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904b7bc-ee4c-4053-8382-8843663f8ec6-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.458962 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.458880 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:14.579820 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.579785 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl"] Apr 28 19:49:14.583198 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:49:14.583165 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf904b7bc_ee4c_4053_8382_8843663f8ec6.slice/crio-650a8d27c21e2379cdb799238095629a6ce504cd72b5282e05ea0e474a459264 WatchSource:0}: Error finding container 650a8d27c21e2379cdb799238095629a6ce504cd72b5282e05ea0e474a459264: Status 404 returned error can't find the container with id 650a8d27c21e2379cdb799238095629a6ce504cd72b5282e05ea0e474a459264 Apr 28 19:49:14.677858 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.677813 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 28 19:49:14.683252 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.683224 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 28 19:49:14.917814 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.917777 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" 
event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerStarted","Data":"79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f"} Apr 28 19:49:14.917992 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:14.917822 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerStarted","Data":"650a8d27c21e2379cdb799238095629a6ce504cd72b5282e05ea0e474a459264"} Apr 28 19:49:18.271737 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.271712 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:49:18.386320 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.386245 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb0fd05-264d-4768-a6b8-c1f03d923740-proxy-tls\") pod \"4fb0fd05-264d-4768-a6b8-c1f03d923740\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " Apr 28 19:49:18.386320 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.386299 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4fb0fd05-264d-4768-a6b8-c1f03d923740-kserve-provision-location\") pod \"4fb0fd05-264d-4768-a6b8-c1f03d923740\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " Apr 28 19:49:18.386514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.386327 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4fb0fd05-264d-4768-a6b8-c1f03d923740-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"4fb0fd05-264d-4768-a6b8-c1f03d923740\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " Apr 28 19:49:18.386514 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:49:18.386396 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdtvb\" (UniqueName: \"kubernetes.io/projected/4fb0fd05-264d-4768-a6b8-c1f03d923740-kube-api-access-wdtvb\") pod \"4fb0fd05-264d-4768-a6b8-c1f03d923740\" (UID: \"4fb0fd05-264d-4768-a6b8-c1f03d923740\") " Apr 28 19:49:18.386677 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.386653 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fb0fd05-264d-4768-a6b8-c1f03d923740-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4fb0fd05-264d-4768-a6b8-c1f03d923740" (UID: "4fb0fd05-264d-4768-a6b8-c1f03d923740"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:49:18.386767 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.386741 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb0fd05-264d-4768-a6b8-c1f03d923740-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "4fb0fd05-264d-4768-a6b8-c1f03d923740" (UID: "4fb0fd05-264d-4768-a6b8-c1f03d923740"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:49:18.388330 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.388305 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb0fd05-264d-4768-a6b8-c1f03d923740-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4fb0fd05-264d-4768-a6b8-c1f03d923740" (UID: "4fb0fd05-264d-4768-a6b8-c1f03d923740"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:49:18.388513 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.388326 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb0fd05-264d-4768-a6b8-c1f03d923740-kube-api-access-wdtvb" (OuterVolumeSpecName: "kube-api-access-wdtvb") pod "4fb0fd05-264d-4768-a6b8-c1f03d923740" (UID: "4fb0fd05-264d-4768-a6b8-c1f03d923740"). InnerVolumeSpecName "kube-api-access-wdtvb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:49:18.487901 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.487860 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4fb0fd05-264d-4768-a6b8-c1f03d923740-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:49:18.487901 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.487897 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wdtvb\" (UniqueName: \"kubernetes.io/projected/4fb0fd05-264d-4768-a6b8-c1f03d923740-kube-api-access-wdtvb\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:49:18.488102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.487912 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb0fd05-264d-4768-a6b8-c1f03d923740-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:49:18.488102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.487927 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4fb0fd05-264d-4768-a6b8-c1f03d923740-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:49:18.931533 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.931498 2539 generic.go:358] "Generic (PLEG): 
container finished" podID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerID="e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988" exitCode=0 Apr 28 19:49:18.931730 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.931589 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" Apr 28 19:49:18.931730 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.931590 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerDied","Data":"e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988"} Apr 28 19:49:18.931730 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.931628 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w" event={"ID":"4fb0fd05-264d-4768-a6b8-c1f03d923740","Type":"ContainerDied","Data":"d21388e9a0f32e9e494cb1473116a867d786bc15ec368880f2df996c40931d89"} Apr 28 19:49:18.931730 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.931646 2539 scope.go:117] "RemoveContainer" containerID="8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798" Apr 28 19:49:18.933104 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.933083 2539 generic.go:358] "Generic (PLEG): container finished" podID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerID="79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f" exitCode=0 Apr 28 19:49:18.933223 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.933138 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerDied","Data":"79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f"} Apr 28 19:49:18.940274 ip-10-0-143-206 
kubenswrapper[2539]: I0428 19:49:18.940191 2539 scope.go:117] "RemoveContainer" containerID="e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988" Apr 28 19:49:18.947449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.947430 2539 scope.go:117] "RemoveContainer" containerID="5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3" Apr 28 19:49:18.954275 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.954258 2539 scope.go:117] "RemoveContainer" containerID="8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798" Apr 28 19:49:18.954523 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:49:18.954504 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798\": container with ID starting with 8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798 not found: ID does not exist" containerID="8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798" Apr 28 19:49:18.954580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.954532 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798"} err="failed to get container status \"8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798\": rpc error: code = NotFound desc = could not find container \"8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798\": container with ID starting with 8f58337d3d0a2a89663dff07b10ff57344288fe234d824d32183da48b1535798 not found: ID does not exist" Apr 28 19:49:18.954580 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.954552 2539 scope.go:117] "RemoveContainer" containerID="e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988" Apr 28 19:49:18.954740 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:49:18.954719 2539 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988\": container with ID starting with e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988 not found: ID does not exist" containerID="e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988" Apr 28 19:49:18.954783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.954744 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988"} err="failed to get container status \"e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988\": rpc error: code = NotFound desc = could not find container \"e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988\": container with ID starting with e60325361e7beef7988d46658683d5ef0045707dabb32f89965de9f7647c2988 not found: ID does not exist" Apr 28 19:49:18.954783 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.954758 2539 scope.go:117] "RemoveContainer" containerID="5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3" Apr 28 19:49:18.954985 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:49:18.954970 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3\": container with ID starting with 5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3 not found: ID does not exist" containerID="5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3" Apr 28 19:49:18.955032 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.954988 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3"} err="failed to get container status \"5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3\": rpc 
error: code = NotFound desc = could not find container \"5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3\": container with ID starting with 5dfea103df6045cfadf5fe4d9372458d01a79f5cfbab4d1eeeca71ddff4147e3 not found: ID does not exist" Apr 28 19:49:18.968716 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.968694 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w"] Apr 28 19:49:18.972148 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:18.972124 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-54b7w"] Apr 28 19:49:19.938352 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:19.938317 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerStarted","Data":"ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080"} Apr 28 19:49:19.938797 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:19.938367 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerStarted","Data":"84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c"} Apr 28 19:49:19.938797 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:19.938637 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:19.957026 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:19.956982 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podStartSLOduration=5.9569678790000005 podStartE2EDuration="5.956967879s" podCreationTimestamp="2026-04-28 19:49:14 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:49:19.955735904 +0000 UTC m=+1986.252091291" watchObservedRunningTime="2026-04-28 19:49:19.956967879 +0000 UTC m=+1986.253323243" Apr 28 19:49:20.271487 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:20.271451 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" path="/var/lib/kubelet/pods/4fb0fd05-264d-4768-a6b8-c1f03d923740/volumes" Apr 28 19:49:20.942599 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:20.942568 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:20.943850 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:20.943825 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:49:21.946195 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:21.946152 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:49:26.951946 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:26.951919 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:49:26.952411 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:26.952363 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" 
podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:49:36.953241 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:36.953198 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:49:46.952825 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:46.952782 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:49:56.952424 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:49:56.952354 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:50:06.953249 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:06.953208 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:50:16.953102 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:16.953060 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:50:26.953425 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:26.953299 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:50:30.268070 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:30.268023 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 19:50:40.271594 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:40.271567 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:50:48.057171 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.057131 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl"] Apr 28 19:50:48.057661 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.057445 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" containerID="cri-o://84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c" gracePeriod=30 Apr 28 19:50:48.057661 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.057488 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" 
containerName="kube-rbac-proxy" containerID="cri-o://ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080" gracePeriod=30 Apr 28 19:50:48.203355 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.203320 2539 generic.go:358] "Generic (PLEG): container finished" podID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerID="ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080" exitCode=2 Apr 28 19:50:48.203528 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.203403 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerDied","Data":"ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080"} Apr 28 19:50:48.338279 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338189 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc"] Apr 28 19:50:48.338510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338495 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kube-rbac-proxy" Apr 28 19:50:48.338510 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338509 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kube-rbac-proxy" Apr 28 19:50:48.338628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338528 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="storage-initializer" Apr 28 19:50:48.338628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338534 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="storage-initializer" Apr 28 19:50:48.338628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338541 2539 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" Apr 28 19:50:48.338628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338548 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" Apr 28 19:50:48.338628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338591 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kube-rbac-proxy" Apr 28 19:50:48.338628 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.338601 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fb0fd05-264d-4768-a6b8-c1f03d923740" containerName="kserve-container" Apr 28 19:50:48.341366 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.341350 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.343457 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.343432 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 28 19:50:48.343557 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.343432 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 28 19:50:48.351972 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.351950 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc"] Apr 28 19:50:48.442284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.442246 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-proxy-tls\") 
pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.442485 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.442357 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.442485 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.442434 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqksd\" (UniqueName: \"kubernetes.io/projected/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kube-api-access-qqksd\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.442485 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.442465 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.543518 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.543481 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqksd\" (UniqueName: 
\"kubernetes.io/projected/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kube-api-access-qqksd\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.543715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.543531 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.543715 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.543658 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.543828 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.543811 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.543966 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.543943 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.544345 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.544323 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.546101 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.546082 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.551851 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.551829 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqksd\" (UniqueName: \"kubernetes.io/projected/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kube-api-access-qqksd\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.651447 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.651333 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:48.781559 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:48.781517 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc"] Apr 28 19:50:48.784689 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:50:48.784663 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20f856f_2dad_4bc0_a1b3_b87db891b6c5.slice/crio-38244b4f314274835211a8cee4fd7a70549775365cf793c1a7748fec7f853ca6 WatchSource:0}: Error finding container 38244b4f314274835211a8cee4fd7a70549775365cf793c1a7748fec7f853ca6: Status 404 returned error can't find the container with id 38244b4f314274835211a8cee4fd7a70549775365cf793c1a7748fec7f853ca6 Apr 28 19:50:49.207826 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:49.207790 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerStarted","Data":"c8311f5fa4927c924683ed08d2d97dd4a5256dbd6658d5cfe552e5636acb3ce0"} Apr 28 19:50:49.207826 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:49.207828 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerStarted","Data":"38244b4f314274835211a8cee4fd7a70549775365cf793c1a7748fec7f853ca6"} Apr 28 19:50:50.268630 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:50.268595 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 28 
19:50:51.947269 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:51.947224 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 28 19:50:53.002135 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.002113 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:50:53.082141 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.082106 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdj79\" (UniqueName: \"kubernetes.io/projected/f904b7bc-ee4c-4053-8382-8843663f8ec6-kube-api-access-mdj79\") pod \"f904b7bc-ee4c-4053-8382-8843663f8ec6\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " Apr 28 19:50:53.082309 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.082151 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904b7bc-ee4c-4053-8382-8843663f8ec6-proxy-tls\") pod \"f904b7bc-ee4c-4053-8382-8843663f8ec6\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " Apr 28 19:50:53.082309 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.082187 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f904b7bc-ee4c-4053-8382-8843663f8ec6-kserve-provision-location\") pod \"f904b7bc-ee4c-4053-8382-8843663f8ec6\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " Apr 28 19:50:53.082309 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.082240 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904b7bc-ee4c-4053-8382-8843663f8ec6-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"f904b7bc-ee4c-4053-8382-8843663f8ec6\" (UID: \"f904b7bc-ee4c-4053-8382-8843663f8ec6\") " Apr 28 19:50:53.082552 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.082528 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f904b7bc-ee4c-4053-8382-8843663f8ec6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f904b7bc-ee4c-4053-8382-8843663f8ec6" (UID: "f904b7bc-ee4c-4053-8382-8843663f8ec6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:50:53.082662 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.082634 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f904b7bc-ee4c-4053-8382-8843663f8ec6-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "f904b7bc-ee4c-4053-8382-8843663f8ec6" (UID: "f904b7bc-ee4c-4053-8382-8843663f8ec6"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:50:53.084366 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.084342 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f904b7bc-ee4c-4053-8382-8843663f8ec6-kube-api-access-mdj79" (OuterVolumeSpecName: "kube-api-access-mdj79") pod "f904b7bc-ee4c-4053-8382-8843663f8ec6" (UID: "f904b7bc-ee4c-4053-8382-8843663f8ec6"). InnerVolumeSpecName "kube-api-access-mdj79". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:50:53.084470 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.084395 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f904b7bc-ee4c-4053-8382-8843663f8ec6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f904b7bc-ee4c-4053-8382-8843663f8ec6" (UID: "f904b7bc-ee4c-4053-8382-8843663f8ec6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:50:53.182969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.182892 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdj79\" (UniqueName: \"kubernetes.io/projected/f904b7bc-ee4c-4053-8382-8843663f8ec6-kube-api-access-mdj79\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:50:53.182969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.182921 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f904b7bc-ee4c-4053-8382-8843663f8ec6-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:50:53.182969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.182931 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f904b7bc-ee4c-4053-8382-8843663f8ec6-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:50:53.182969 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.182941 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f904b7bc-ee4c-4053-8382-8843663f8ec6-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:50:53.220732 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.220703 2539 generic.go:358] "Generic (PLEG): container finished" 
podID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerID="84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c" exitCode=0 Apr 28 19:50:53.220898 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.220767 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerDied","Data":"84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c"} Apr 28 19:50:53.220898 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.220797 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" Apr 28 19:50:53.220898 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.220813 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl" event={"ID":"f904b7bc-ee4c-4053-8382-8843663f8ec6","Type":"ContainerDied","Data":"650a8d27c21e2379cdb799238095629a6ce504cd72b5282e05ea0e474a459264"} Apr 28 19:50:53.220898 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.220835 2539 scope.go:117] "RemoveContainer" containerID="ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080" Apr 28 19:50:53.222211 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.222186 2539 generic.go:358] "Generic (PLEG): container finished" podID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerID="c8311f5fa4927c924683ed08d2d97dd4a5256dbd6658d5cfe552e5636acb3ce0" exitCode=0 Apr 28 19:50:53.222303 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.222244 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerDied","Data":"c8311f5fa4927c924683ed08d2d97dd4a5256dbd6658d5cfe552e5636acb3ce0"} Apr 28 19:50:53.229250 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:50:53.229218 2539 scope.go:117] "RemoveContainer" containerID="84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c" Apr 28 19:50:53.236523 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.236500 2539 scope.go:117] "RemoveContainer" containerID="79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f" Apr 28 19:50:53.243347 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.243325 2539 scope.go:117] "RemoveContainer" containerID="ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080" Apr 28 19:50:53.243692 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:50:53.243667 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080\": container with ID starting with ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080 not found: ID does not exist" containerID="ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080" Apr 28 19:50:53.243772 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.243705 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080"} err="failed to get container status \"ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080\": rpc error: code = NotFound desc = could not find container \"ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080\": container with ID starting with ff31ea075f9a7fae6fc5770e737ab191857e1ae4c8b38bee0c441f7f25df9080 not found: ID does not exist" Apr 28 19:50:53.243772 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.243733 2539 scope.go:117] "RemoveContainer" containerID="84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c" Apr 28 19:50:53.244007 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:50:53.243986 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c\": container with ID starting with 84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c not found: ID does not exist" containerID="84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c" Apr 28 19:50:53.244048 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.244013 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c"} err="failed to get container status \"84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c\": rpc error: code = NotFound desc = could not find container \"84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c\": container with ID starting with 84fd77e2fc3aa0f1fe8d63deb18eeb2298d783fec34b86e4ae5a8c6d8000195c not found: ID does not exist" Apr 28 19:50:53.244048 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.244031 2539 scope.go:117] "RemoveContainer" containerID="79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f" Apr 28 19:50:53.244303 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:50:53.244283 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f\": container with ID starting with 79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f not found: ID does not exist" containerID="79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f" Apr 28 19:50:53.244364 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.244311 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f"} err="failed to get container status \"79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f\": rpc error: code = NotFound desc = could 
not find container \"79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f\": container with ID starting with 79cf404d549f6d3f434714107d76ccc74c631ab8cd411820b66447bd1a67c81f not found: ID does not exist" Apr 28 19:50:53.270609 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.270584 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl"] Apr 28 19:50:53.273758 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:53.273732 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6gsl"] Apr 28 19:50:54.227426 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.227363 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerStarted","Data":"cd08c57a916e76bc62193aa2219a3448f79742b748472b43d1456eb53c15754f"} Apr 28 19:50:54.227426 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.227428 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerStarted","Data":"9b16481b380e59aa81aeadb7905c0ef11f3e6cf718b5a75e8d77fb00aac8f761"} Apr 28 19:50:54.227967 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.227715 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:54.227967 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.227855 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:50:54.229066 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.229037 2539 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:50:54.259295 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.259236 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podStartSLOduration=6.259221936 podStartE2EDuration="6.259221936s" podCreationTimestamp="2026-04-28 19:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:54.256982064 +0000 UTC m=+2080.553337428" watchObservedRunningTime="2026-04-28 19:50:54.259221936 +0000 UTC m=+2080.555577300" Apr 28 19:50:54.271772 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:54.271743 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" path="/var/lib/kubelet/pods/f904b7bc-ee4c-4053-8382-8843663f8ec6/volumes" Apr 28 19:50:55.230530 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:50:55.230492 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:51:00.235300 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:00.235272 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:51:00.235917 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:00.235890 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" 
podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:51:10.235782 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:10.235738 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:51:14.305547 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:14.305518 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:51:14.309237 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:14.309211 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 19:51:20.236407 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:20.236350 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:51:30.236329 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:30.236288 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:51:40.236649 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:40.236610 2539 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:51:50.236498 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:51:50.236457 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:52:00.236284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:00.236199 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:52:10.237218 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:10.237187 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:52:18.424550 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.424517 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc"] Apr 28 19:52:18.425090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.424834 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" containerID="cri-o://9b16481b380e59aa81aeadb7905c0ef11f3e6cf718b5a75e8d77fb00aac8f761" gracePeriod=30 Apr 28 19:52:18.425090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.424859 2539 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kube-rbac-proxy" containerID="cri-o://cd08c57a916e76bc62193aa2219a3448f79742b748472b43d1456eb53c15754f" gracePeriod=30 Apr 28 19:52:18.736504 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736428 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9"] Apr 28 19:52:18.736748 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736736 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kube-rbac-proxy" Apr 28 19:52:18.736798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736750 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kube-rbac-proxy" Apr 28 19:52:18.736798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736762 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" Apr 28 19:52:18.736798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736767 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" Apr 28 19:52:18.736798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736776 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="storage-initializer" Apr 28 19:52:18.736798 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736784 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="storage-initializer" Apr 28 19:52:18.736957 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736844 2539 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kube-rbac-proxy" Apr 28 19:52:18.736957 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.736853 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f904b7bc-ee4c-4053-8382-8843663f8ec6" containerName="kserve-container" Apr 28 19:52:18.739826 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.739810 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.741898 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.741880 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 28 19:52:18.741997 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.741901 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 28 19:52:18.748705 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.748683 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9"] Apr 28 19:52:18.806297 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.806262 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d873ed12-cd11-498a-8ff7-98d4bce6013c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.806514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.806305 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/d873ed12-cd11-498a-8ff7-98d4bce6013c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.806514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.806418 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d873ed12-cd11-498a-8ff7-98d4bce6013c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.806514 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.806450 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44w8\" (UniqueName: \"kubernetes.io/projected/d873ed12-cd11-498a-8ff7-98d4bce6013c-kube-api-access-w44w8\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.907228 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.907188 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d873ed12-cd11-498a-8ff7-98d4bce6013c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.907228 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.907233 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/d873ed12-cd11-498a-8ff7-98d4bce6013c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.907492 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.907284 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d873ed12-cd11-498a-8ff7-98d4bce6013c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.907492 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.907304 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w44w8\" (UniqueName: \"kubernetes.io/projected/d873ed12-cd11-498a-8ff7-98d4bce6013c-kube-api-access-w44w8\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.907644 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.907626 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d873ed12-cd11-498a-8ff7-98d4bce6013c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.907926 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.907899 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/d873ed12-cd11-498a-8ff7-98d4bce6013c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.909781 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.909759 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d873ed12-cd11-498a-8ff7-98d4bce6013c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:18.914635 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:18.914610 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44w8\" (UniqueName: \"kubernetes.io/projected/d873ed12-cd11-498a-8ff7-98d4bce6013c-kube-api-access-w44w8\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:19.049990 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:19.049952 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:19.173289 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:19.173240 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9"] Apr 28 19:52:19.175455 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:52:19.175425 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd873ed12_cd11_498a_8ff7_98d4bce6013c.slice/crio-b3e93c4e0e4ed0cf0b573b7152bc60d0149e83d7e63d4d46f33d1709a3f74a1e WatchSource:0}: Error finding container b3e93c4e0e4ed0cf0b573b7152bc60d0149e83d7e63d4d46f33d1709a3f74a1e: Status 404 returned error can't find the container with id b3e93c4e0e4ed0cf0b573b7152bc60d0149e83d7e63d4d46f33d1709a3f74a1e Apr 28 19:52:19.468197 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:19.468120 2539 generic.go:358] "Generic (PLEG): container finished" podID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerID="cd08c57a916e76bc62193aa2219a3448f79742b748472b43d1456eb53c15754f" exitCode=2 Apr 28 19:52:19.468614 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:19.468195 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerDied","Data":"cd08c57a916e76bc62193aa2219a3448f79742b748472b43d1456eb53c15754f"} Apr 28 19:52:19.469468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:19.469437 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerStarted","Data":"cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1"} Apr 28 19:52:19.469468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:19.469468 2539 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerStarted","Data":"b3e93c4e0e4ed0cf0b573b7152bc60d0149e83d7e63d4d46f33d1709a3f74a1e"} Apr 28 19:52:20.230930 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:20.230886 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 28 19:52:20.236684 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:20.236659 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 28 19:52:23.484357 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.484274 2539 generic.go:358] "Generic (PLEG): container finished" podID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerID="9b16481b380e59aa81aeadb7905c0ef11f3e6cf718b5a75e8d77fb00aac8f761" exitCode=0 Apr 28 19:52:23.484856 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.484351 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerDied","Data":"9b16481b380e59aa81aeadb7905c0ef11f3e6cf718b5a75e8d77fb00aac8f761"} Apr 28 19:52:23.485627 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.485607 2539 generic.go:358] "Generic (PLEG): container finished" podID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerID="cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1" exitCode=0 Apr 28 19:52:23.485733 
ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.485641 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerDied","Data":"cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1"} Apr 28 19:52:23.486852 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.486837 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:52:23.669818 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.669794 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:52:23.746958 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.746931 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kserve-provision-location\") pod \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " Apr 28 19:52:23.747147 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.746992 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-proxy-tls\") pod \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " Apr 28 19:52:23.747147 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.747051 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqksd\" (UniqueName: \"kubernetes.io/projected/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kube-api-access-qqksd\") pod \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " Apr 28 19:52:23.747147 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.747111 2539 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\" (UID: \"b20f856f-2dad-4bc0-a1b3-b87db891b6c5\") " Apr 28 19:52:23.747315 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.747286 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b20f856f-2dad-4bc0-a1b3-b87db891b6c5" (UID: "b20f856f-2dad-4bc0-a1b3-b87db891b6c5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:52:23.747521 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.747497 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "b20f856f-2dad-4bc0-a1b3-b87db891b6c5" (UID: "b20f856f-2dad-4bc0-a1b3-b87db891b6c5"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:52:23.749128 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.749099 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b20f856f-2dad-4bc0-a1b3-b87db891b6c5" (UID: "b20f856f-2dad-4bc0-a1b3-b87db891b6c5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:52:23.749233 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.749156 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kube-api-access-qqksd" (OuterVolumeSpecName: "kube-api-access-qqksd") pod "b20f856f-2dad-4bc0-a1b3-b87db891b6c5" (UID: "b20f856f-2dad-4bc0-a1b3-b87db891b6c5"). InnerVolumeSpecName "kube-api-access-qqksd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:52:23.848413 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.848365 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqksd\" (UniqueName: \"kubernetes.io/projected/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kube-api-access-qqksd\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:52:23.848413 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.848411 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:52:23.848413 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.848421 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:52:23.848642 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:23.848430 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b20f856f-2dad-4bc0-a1b3-b87db891b6c5-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:52:24.495468 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.495416 2539 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" event={"ID":"b20f856f-2dad-4bc0-a1b3-b87db891b6c5","Type":"ContainerDied","Data":"38244b4f314274835211a8cee4fd7a70549775365cf793c1a7748fec7f853ca6"} Apr 28 19:52:24.495939 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.495505 2539 scope.go:117] "RemoveContainer" containerID="cd08c57a916e76bc62193aa2219a3448f79742b748472b43d1456eb53c15754f" Apr 28 19:52:24.495939 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.495533 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc" Apr 28 19:52:24.497713 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.497683 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerStarted","Data":"c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5"} Apr 28 19:52:24.497843 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.497726 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerStarted","Data":"025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4"} Apr 28 19:52:24.497981 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.497961 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:24.498068 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.497991 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:52:24.503657 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.503640 2539 scope.go:117] 
"RemoveContainer" containerID="9b16481b380e59aa81aeadb7905c0ef11f3e6cf718b5a75e8d77fb00aac8f761" Apr 28 19:52:24.512746 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.512725 2539 scope.go:117] "RemoveContainer" containerID="c8311f5fa4927c924683ed08d2d97dd4a5256dbd6658d5cfe552e5636acb3ce0" Apr 28 19:52:24.530702 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.530625 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podStartSLOduration=6.530608755 podStartE2EDuration="6.530608755s" podCreationTimestamp="2026-04-28 19:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:52:24.528429549 +0000 UTC m=+2170.824784913" watchObservedRunningTime="2026-04-28 19:52:24.530608755 +0000 UTC m=+2170.826964121" Apr 28 19:52:24.541220 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.541193 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc"] Apr 28 19:52:24.545401 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:24.545361 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-8whhc"] Apr 28 19:52:26.270706 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:26.270676 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" path="/var/lib/kubelet/pods/b20f856f-2dad-4bc0-a1b3-b87db891b6c5/volumes" Apr 28 19:52:30.507015 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:52:30.506985 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:53:00.507859 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:00.507818 2539 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.28:8080: connect: connection refused" Apr 28 19:53:10.508090 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:10.508044 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.28:8080: connect: connection refused" Apr 28 19:53:20.507639 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:20.507597 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.28:8080: connect: connection refused" Apr 28 19:53:30.507984 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:30.507896 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.28:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.28:8080: connect: connection refused" Apr 28 19:53:40.511936 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:40.511900 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:53:48.673709 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:53:48.673679 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9"] Apr 28 19:53:48.674156 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.673977 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" containerID="cri-o://025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4" gracePeriod=30 Apr 28 19:53:48.674156 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.674043 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kube-rbac-proxy" containerID="cri-o://c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5" gracePeriod=30 Apr 28 19:53:48.888823 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.888789 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6"] Apr 28 19:53:48.889171 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889154 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="storage-initializer" Apr 28 19:53:48.889284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889174 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="storage-initializer" Apr 28 19:53:48.889284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889185 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" Apr 28 19:53:48.889284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889193 2539 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" Apr 28 19:53:48.889284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889203 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kube-rbac-proxy" Apr 28 19:53:48.889284 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889211 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kube-rbac-proxy" Apr 28 19:53:48.889563 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889308 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kserve-container" Apr 28 19:53:48.889563 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.889323 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20f856f-2dad-4bc0-a1b3-b87db891b6c5" containerName="kube-rbac-proxy" Apr 28 19:53:48.892619 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.892598 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:48.894588 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.894561 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 28 19:53:48.894698 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.894572 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 28 19:53:48.903129 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.903109 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6"] Apr 28 19:53:48.975875 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.975788 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:48.975875 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.975852 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:48.976105 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.975925 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:48.976105 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:48.975969 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzfx\" (UniqueName: \"kubernetes.io/projected/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kube-api-access-6wzfx\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.077316 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.077269 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.077545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.077344 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.077545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.077404 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-6wzfx\" (UniqueName: \"kubernetes.io/projected/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kube-api-access-6wzfx\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.077545 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.077481 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.077831 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.077798 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.078172 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.078151 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.079981 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.079955 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.084769 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.084745 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wzfx\" (UniqueName: \"kubernetes.io/projected/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kube-api-access-6wzfx\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.202187 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.202153 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:49.322040 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.321919 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6"] Apr 28 19:53:49.324669 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:53:49.324646 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca98e797_cfae_4f7b_b8fc_7c1bfb7475cb.slice/crio-4fec49f3686b4f6ff1dcca850575c49d14d9a56309a83c4257c9e8a16c4a3722 WatchSource:0}: Error finding container 4fec49f3686b4f6ff1dcca850575c49d14d9a56309a83c4257c9e8a16c4a3722: Status 404 returned error can't find the container with id 4fec49f3686b4f6ff1dcca850575c49d14d9a56309a83c4257c9e8a16c4a3722 Apr 28 19:53:49.732534 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.732447 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerStarted","Data":"2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f"} Apr 28 19:53:49.732534 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.732495 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerStarted","Data":"4fec49f3686b4f6ff1dcca850575c49d14d9a56309a83c4257c9e8a16c4a3722"} Apr 28 19:53:49.734426 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.734403 2539 generic.go:358] "Generic (PLEG): container finished" podID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerID="c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5" exitCode=2 Apr 28 19:53:49.734537 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:49.734440 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerDied","Data":"c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5"} Apr 28 19:53:50.502249 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:50.502196 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 28 19:53:50.507966 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:50.507934 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.28:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.28:8080: connect: connection refused" Apr 28 19:53:53.714851 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.714825 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:53:53.746662 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.746624 2539 generic.go:358] "Generic (PLEG): container finished" podID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerID="025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4" exitCode=0 Apr 28 19:53:53.746842 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.746707 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerDied","Data":"025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4"} Apr 28 19:53:53.746842 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.746745 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" event={"ID":"d873ed12-cd11-498a-8ff7-98d4bce6013c","Type":"ContainerDied","Data":"b3e93c4e0e4ed0cf0b573b7152bc60d0149e83d7e63d4d46f33d1709a3f74a1e"} Apr 28 19:53:53.746842 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.746765 2539 scope.go:117] "RemoveContainer" containerID="c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5" Apr 28 19:53:53.746842 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.746717 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9" Apr 28 19:53:53.748173 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.748156 2539 generic.go:358] "Generic (PLEG): container finished" podID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerID="2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f" exitCode=0 Apr 28 19:53:53.748262 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.748219 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerDied","Data":"2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f"} Apr 28 19:53:53.754942 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.754928 2539 scope.go:117] "RemoveContainer" containerID="025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4" Apr 28 19:53:53.761905 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.761886 2539 scope.go:117] "RemoveContainer" containerID="cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1" Apr 28 19:53:53.775622 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.775601 2539 scope.go:117] "RemoveContainer" containerID="c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5" Apr 28 19:53:53.775927 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:53:53.775907 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5\": container with ID starting with c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5 not found: ID does not exist" containerID="c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5" Apr 28 19:53:53.775989 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.775942 2539 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5"} err="failed to get container status \"c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5\": rpc error: code = NotFound desc = could not find container \"c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5\": container with ID starting with c4d98398d44d7f895f7dc118b738a75dd6dc74ebbe535fc21637c60305c119b5 not found: ID does not exist" Apr 28 19:53:53.775989 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.775965 2539 scope.go:117] "RemoveContainer" containerID="025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4" Apr 28 19:53:53.776220 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:53:53.776203 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4\": container with ID starting with 025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4 not found: ID does not exist" containerID="025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4" Apr 28 19:53:53.776270 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.776227 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4"} err="failed to get container status \"025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4\": rpc error: code = NotFound desc = could not find container \"025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4\": container with ID starting with 025feb9c12fb46364b3def8e9dfb8771f4d86c71ec3638c7a06e4d9fb29c20a4 not found: ID does not exist" Apr 28 19:53:53.776270 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.776243 2539 scope.go:117] "RemoveContainer" containerID="cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1" Apr 28 19:53:53.776528 ip-10-0-143-206 
kubenswrapper[2539]: E0428 19:53:53.776513 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1\": container with ID starting with cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1 not found: ID does not exist" containerID="cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1" Apr 28 19:53:53.776603 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.776533 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1"} err="failed to get container status \"cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1\": rpc error: code = NotFound desc = could not find container \"cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1\": container with ID starting with cf66230219e0300e924eb072ccde1d9db7c4fe1ef96cdfc31462ee9efd4df1d1 not found: ID does not exist" Apr 28 19:53:53.818776 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.818755 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d873ed12-cd11-498a-8ff7-98d4bce6013c-proxy-tls\") pod \"d873ed12-cd11-498a-8ff7-98d4bce6013c\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " Apr 28 19:53:53.818857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.818803 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w44w8\" (UniqueName: \"kubernetes.io/projected/d873ed12-cd11-498a-8ff7-98d4bce6013c-kube-api-access-w44w8\") pod \"d873ed12-cd11-498a-8ff7-98d4bce6013c\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " Apr 28 19:53:53.818857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.818827 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d873ed12-cd11-498a-8ff7-98d4bce6013c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"d873ed12-cd11-498a-8ff7-98d4bce6013c\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " Apr 28 19:53:53.818857 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.818851 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d873ed12-cd11-498a-8ff7-98d4bce6013c-kserve-provision-location\") pod \"d873ed12-cd11-498a-8ff7-98d4bce6013c\" (UID: \"d873ed12-cd11-498a-8ff7-98d4bce6013c\") " Apr 28 19:53:53.819189 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.819159 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d873ed12-cd11-498a-8ff7-98d4bce6013c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "d873ed12-cd11-498a-8ff7-98d4bce6013c" (UID: "d873ed12-cd11-498a-8ff7-98d4bce6013c"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:53:53.819278 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.819184 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d873ed12-cd11-498a-8ff7-98d4bce6013c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d873ed12-cd11-498a-8ff7-98d4bce6013c" (UID: "d873ed12-cd11-498a-8ff7-98d4bce6013c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:53:53.820974 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.820947 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d873ed12-cd11-498a-8ff7-98d4bce6013c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d873ed12-cd11-498a-8ff7-98d4bce6013c" (UID: "d873ed12-cd11-498a-8ff7-98d4bce6013c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:53:53.821248 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.821228 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d873ed12-cd11-498a-8ff7-98d4bce6013c-kube-api-access-w44w8" (OuterVolumeSpecName: "kube-api-access-w44w8") pod "d873ed12-cd11-498a-8ff7-98d4bce6013c" (UID: "d873ed12-cd11-498a-8ff7-98d4bce6013c"). InnerVolumeSpecName "kube-api-access-w44w8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:53:53.919596 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.919558 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d873ed12-cd11-498a-8ff7-98d4bce6013c-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:53:53.919596 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.919588 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w44w8\" (UniqueName: \"kubernetes.io/projected/d873ed12-cd11-498a-8ff7-98d4bce6013c-kube-api-access-w44w8\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:53:53.919596 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.919599 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d873ed12-cd11-498a-8ff7-98d4bce6013c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 
19:53:53.919836 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:53.919610 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d873ed12-cd11-498a-8ff7-98d4bce6013c-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 19:53:54.067720 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.067690 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9"] Apr 28 19:53:54.071318 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.071295 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-5l4f9"] Apr 28 19:53:54.275189 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.275158 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" path="/var/lib/kubelet/pods/d873ed12-cd11-498a-8ff7-98d4bce6013c/volumes" Apr 28 19:53:54.752891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.752854 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerStarted","Data":"bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e"} Apr 28 19:53:54.752891 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.752897 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerStarted","Data":"0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789"} Apr 28 19:53:54.753418 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.753261 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:54.753418 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.753298 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:53:54.771460 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:53:54.771411 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podStartSLOduration=6.771396404 podStartE2EDuration="6.771396404s" podCreationTimestamp="2026-04-28 19:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:53:54.769428689 +0000 UTC m=+2261.065784052" watchObservedRunningTime="2026-04-28 19:53:54.771396404 +0000 UTC m=+2261.067751761" Apr 28 19:54:00.763696 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:54:00.763659 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:54:30.764390 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:54:30.764327 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.29:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.29:8080: connect: connection refused" Apr 28 19:54:40.765115 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:54:40.765078 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.29:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.29:8080: connect: connection refused" Apr 28 19:54:50.765118 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:54:50.765079 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.29:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.29:8080: connect: connection refused" Apr 28 19:55:00.764762 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:00.764672 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.29:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.29:8080: connect: connection refused" Apr 28 19:55:10.767602 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:10.767569 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:55:19.019710 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.019675 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6"] Apr 28 19:55:19.020162 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.019970 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container" containerID="cri-o://0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789" gracePeriod=30 Apr 28 19:55:19.020162 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.020000 
2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kube-rbac-proxy" containerID="cri-o://bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e" gracePeriod=30 Apr 28 19:55:19.238158 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238115 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"] Apr 28 19:55:19.238520 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238502 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="storage-initializer" Apr 28 19:55:19.238615 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238521 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="storage-initializer" Apr 28 19:55:19.238615 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238540 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kube-rbac-proxy" Apr 28 19:55:19.238615 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238549 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kube-rbac-proxy" Apr 28 19:55:19.238615 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238564 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" Apr 28 19:55:19.238615 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238572 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" Apr 28 19:55:19.238866 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238646 2539 
memory_manager.go:356] "RemoveStaleState removing state" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kserve-container" Apr 28 19:55:19.238866 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.238658 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d873ed12-cd11-498a-8ff7-98d4bce6013c" containerName="kube-rbac-proxy" Apr 28 19:55:19.241933 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.241912 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.244013 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.243993 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 28 19:55:19.244114 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.244044 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 28 19:55:19.245154 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.245134 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2wz\" (UniqueName: \"kubernetes.io/projected/f1625546-f1a3-4c8e-9f98-065e66e22c19-kube-api-access-jp2wz\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.245215 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.245199 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1625546-f1a3-4c8e-9f98-065e66e22c19-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: 
\"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.245270 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.245254 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1625546-f1a3-4c8e-9f98-065e66e22c19-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.245332 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.245316 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1625546-f1a3-4c8e-9f98-065e66e22c19-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.250111 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.250088 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"] Apr 28 19:55:19.346400 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.346292 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1625546-f1a3-4c8e-9f98-065e66e22c19-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.346400 ip-10-0-143-206 kubenswrapper[2539]: I0428 
19:55:19.346345 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1625546-f1a3-4c8e-9f98-065e66e22c19-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.346400 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.346395 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2wz\" (UniqueName: \"kubernetes.io/projected/f1625546-f1a3-4c8e-9f98-065e66e22c19-kube-api-access-jp2wz\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.346662 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.346429 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1625546-f1a3-4c8e-9f98-065e66e22c19-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.346856 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.346837 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1625546-f1a3-4c8e-9f98-065e66e22c19-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.346963 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.346944 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1625546-f1a3-4c8e-9f98-065e66e22c19-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.348762 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.348733 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1625546-f1a3-4c8e-9f98-065e66e22c19-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.353492 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.353471 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2wz\" (UniqueName: \"kubernetes.io/projected/f1625546-f1a3-4c8e-9f98-065e66e22c19-kube-api-access-jp2wz\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.553364 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.553327 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" Apr 28 19:55:19.686918 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.686877 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"] Apr 28 19:55:19.690317 ip-10-0-143-206 kubenswrapper[2539]: W0428 19:55:19.690290 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1625546_f1a3_4c8e_9f98_065e66e22c19.slice/crio-3b1ec56535b7de2fda625d0ec002046f512aeb45fc98347415541ecb0e789dcb WatchSource:0}: Error finding container 3b1ec56535b7de2fda625d0ec002046f512aeb45fc98347415541ecb0e789dcb: Status 404 returned error can't find the container with id 3b1ec56535b7de2fda625d0ec002046f512aeb45fc98347415541ecb0e789dcb Apr 28 19:55:19.992245 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.992151 2539 generic.go:358] "Generic (PLEG): container finished" podID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerID="bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e" exitCode=2 Apr 28 19:55:19.992245 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.992228 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerDied","Data":"bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e"} Apr 28 19:55:19.993672 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.993642 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerStarted","Data":"e57184524694e4de9830b677811884ca5fc524aed6b5b787fb3ab30bcb3b4988"} Apr 28 19:55:19.993793 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:19.993679 2539 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerStarted","Data":"3b1ec56535b7de2fda625d0ec002046f512aeb45fc98347415541ecb0e789dcb"} Apr 28 19:55:20.758425 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:20.758356 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 28 19:55:20.764961 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:20.764920 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.29:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.29:8080: connect: connection refused" Apr 28 19:55:23.959592 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.959571 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:55:23.981263 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.981233 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-proxy-tls\") pod \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " Apr 28 19:55:23.981445 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.981314 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kserve-provision-location\") pod \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " Apr 28 19:55:23.981445 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.981349 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wzfx\" (UniqueName: \"kubernetes.io/projected/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kube-api-access-6wzfx\") pod \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " Apr 28 19:55:23.981445 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.981414 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\" (UID: \"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb\") " Apr 28 19:55:23.981750 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.981672 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" (UID: "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:55:23.982056 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.982030 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" (UID: "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:55:23.983774 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.983751 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" (UID: "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:55:23.983875 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:23.983854 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kube-api-access-6wzfx" (OuterVolumeSpecName: "kube-api-access-6wzfx") pod "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" (UID: "ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb"). InnerVolumeSpecName "kube-api-access-6wzfx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:55:24.006279 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.006242 2539 generic.go:358] "Generic (PLEG): container finished" podID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerID="0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789" exitCode=0 Apr 28 19:55:24.006449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.006303 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerDied","Data":"0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789"} Apr 28 19:55:24.006449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.006340 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" Apr 28 19:55:24.006449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.006350 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6" event={"ID":"ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb","Type":"ContainerDied","Data":"4fec49f3686b4f6ff1dcca850575c49d14d9a56309a83c4257c9e8a16c4a3722"} Apr 28 19:55:24.006449 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.006367 2539 scope.go:117] "RemoveContainer" containerID="bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e" Apr 28 19:55:24.007853 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.007832 2539 generic.go:358] "Generic (PLEG): container finished" podID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerID="e57184524694e4de9830b677811884ca5fc524aed6b5b787fb3ab30bcb3b4988" exitCode=0 Apr 28 19:55:24.007965 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.007870 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" 
event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerDied","Data":"e57184524694e4de9830b677811884ca5fc524aed6b5b787fb3ab30bcb3b4988"} Apr 28 19:55:24.015574 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.015338 2539 scope.go:117] "RemoveContainer" containerID="0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789" Apr 28 19:55:24.022721 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.022697 2539 scope.go:117] "RemoveContainer" containerID="2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f" Apr 28 19:55:24.030839 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.030817 2539 scope.go:117] "RemoveContainer" containerID="bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e" Apr 28 19:55:24.031113 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:55:24.031093 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e\": container with ID starting with bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e not found: ID does not exist" containerID="bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e" Apr 28 19:55:24.031203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.031127 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e"} err="failed to get container status \"bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e\": rpc error: code = NotFound desc = could not find container \"bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e\": container with ID starting with bc526e3e26a620fd3bdcd95b04997cdd74a9f77e518e21fc1f8ae7c40a3a992e not found: ID does not exist" Apr 28 19:55:24.031203 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.031153 2539 scope.go:117] "RemoveContainer" 
containerID="0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789"
Apr 28 19:55:24.031467 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:55:24.031448 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789\": container with ID starting with 0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789 not found: ID does not exist" containerID="0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789"
Apr 28 19:55:24.031548 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.031477 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789"} err="failed to get container status \"0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789\": rpc error: code = NotFound desc = could not find container \"0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789\": container with ID starting with 0222e80bd21c881579b13dfccb61c1a07bbf56ea5f6df4aac5c56f235adb4789 not found: ID does not exist"
Apr 28 19:55:24.031548 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.031502 2539 scope.go:117] "RemoveContainer" containerID="2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f"
Apr 28 19:55:24.031758 ip-10-0-143-206 kubenswrapper[2539]: E0428 19:55:24.031738 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f\": container with ID starting with 2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f not found: ID does not exist" containerID="2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f"
Apr 28 19:55:24.031801 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.031764 2539 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f"} err="failed to get container status \"2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f\": rpc error: code = NotFound desc = could not find container \"2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f\": container with ID starting with 2c9bf06e1e5af5cd49081f41243f2fc899c30f99a2b980e892966be18829097f not found: ID does not exist"
Apr 28 19:55:24.039101 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.039075 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6"]
Apr 28 19:55:24.042922 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.042900 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-svdg6"]
Apr 28 19:55:24.083063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.082948 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:55:24.083063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.082973 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:55:24.083063 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.082989 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:55:24.083063 ip-10-0-143-206 kubenswrapper[2539]: I0428
19:55:24.083003 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6wzfx\" (UniqueName: \"kubernetes.io/projected/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb-kube-api-access-6wzfx\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:55:24.272419 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:24.272391 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" path="/var/lib/kubelet/pods/ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb/volumes"
Apr 28 19:55:25.013667 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:25.013638 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerStarted","Data":"92ca8a2231bf6f8b89c77f748fcb6f7c5019137979b131c6ceef20ccb7a8d547"}
Apr 28 19:55:25.013667 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:25.013669 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerStarted","Data":"2d093c8dabc4382da893cd4a9c1e4db3c170ff7cc16e106ffaa371891cdcca52"}
Apr 28 19:55:25.014122 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:25.013870 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"
Apr 28 19:55:25.033361 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:25.033313 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podStartSLOduration=6.033300011 podStartE2EDuration="6.033300011s" podCreationTimestamp="2026-04-28 19:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28
19:55:25.031414917 +0000 UTC m=+2351.327770278" watchObservedRunningTime="2026-04-28 19:55:25.033300011 +0000 UTC m=+2351.329655375"
Apr 28 19:55:26.016781 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:26.016749 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"
Apr 28 19:55:32.024951 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:55:32.024914 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"
Apr 28 19:56:02.026471 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:02.026431 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 28 19:56:12.025911 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:12.025866 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 28 19:56:14.325204 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:14.325174 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 19:56:14.329320 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:14.329291 2539 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 19:56:22.026525 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:22.026475 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 28 19:56:32.026363 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:32.026277 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 28 19:56:42.028673 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:42.028644 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"
Apr 28 19:56:49.066240 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:49.066203 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"]
Apr 28 19:56:49.066815 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:49.066629 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container" containerID="cri-o://2d093c8dabc4382da893cd4a9c1e4db3c170ff7cc16e106ffaa371891cdcca52" gracePeriod=30
Apr 28 19:56:49.066815 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:49.066711
2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kube-rbac-proxy" containerID="cri-o://92ca8a2231bf6f8b89c77f748fcb6f7c5019137979b131c6ceef20ccb7a8d547" gracePeriod=30
Apr 28 19:56:49.256840 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:49.256806 2539 generic.go:358] "Generic (PLEG): container finished" podID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerID="92ca8a2231bf6f8b89c77f748fcb6f7c5019137979b131c6ceef20ccb7a8d547" exitCode=2
Apr 28 19:56:49.257002 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:49.256876 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerDied","Data":"92ca8a2231bf6f8b89c77f748fcb6f7c5019137979b131c6ceef20ccb7a8d547"}
Apr 28 19:56:52.020725 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:52.020680 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused"
Apr 28 19:56:52.025537 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:52.025504 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 28 19:56:54.272767 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.272735 2539 generic.go:358] "Generic (PLEG): container
finished" podID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerID="2d093c8dabc4382da893cd4a9c1e4db3c170ff7cc16e106ffaa371891cdcca52" exitCode=0
Apr 28 19:56:54.273154 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.272801 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerDied","Data":"2d093c8dabc4382da893cd4a9c1e4db3c170ff7cc16e106ffaa371891cdcca52"}
Apr 28 19:56:54.508077 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.508051 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"
Apr 28 19:56:54.638072 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.637987 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2wz\" (UniqueName: \"kubernetes.io/projected/f1625546-f1a3-4c8e-9f98-065e66e22c19-kube-api-access-jp2wz\") pod \"f1625546-f1a3-4c8e-9f98-065e66e22c19\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") "
Apr 28 19:56:54.638072 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.638038 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1625546-f1a3-4c8e-9f98-065e66e22c19-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"f1625546-f1a3-4c8e-9f98-065e66e22c19\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") "
Apr 28 19:56:54.638262 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.638080 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1625546-f1a3-4c8e-9f98-065e66e22c19-kserve-provision-location\") pod \"f1625546-f1a3-4c8e-9f98-065e66e22c19\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") "
Apr 28
19:56:54.638262 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.638123 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1625546-f1a3-4c8e-9f98-065e66e22c19-proxy-tls\") pod \"f1625546-f1a3-4c8e-9f98-065e66e22c19\" (UID: \"f1625546-f1a3-4c8e-9f98-065e66e22c19\") "
Apr 28 19:56:54.638476 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.638440 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1625546-f1a3-4c8e-9f98-065e66e22c19-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "f1625546-f1a3-4c8e-9f98-065e66e22c19" (UID: "f1625546-f1a3-4c8e-9f98-065e66e22c19"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:56:54.638476 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.638466 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1625546-f1a3-4c8e-9f98-065e66e22c19-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1625546-f1a3-4c8e-9f98-065e66e22c19" (UID: "f1625546-f1a3-4c8e-9f98-065e66e22c19"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:56:54.640188 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.640167 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1625546-f1a3-4c8e-9f98-065e66e22c19-kube-api-access-jp2wz" (OuterVolumeSpecName: "kube-api-access-jp2wz") pod "f1625546-f1a3-4c8e-9f98-065e66e22c19" (UID: "f1625546-f1a3-4c8e-9f98-065e66e22c19"). InnerVolumeSpecName "kube-api-access-jp2wz".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:56:54.640274 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.640229 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1625546-f1a3-4c8e-9f98-065e66e22c19-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f1625546-f1a3-4c8e-9f98-065e66e22c19" (UID: "f1625546-f1a3-4c8e-9f98-065e66e22c19"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:56:54.739538 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.739496 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jp2wz\" (UniqueName: \"kubernetes.io/projected/f1625546-f1a3-4c8e-9f98-065e66e22c19-kube-api-access-jp2wz\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:56:54.739538 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.739532 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1625546-f1a3-4c8e-9f98-065e66e22c19-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:56:54.739538 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.739545 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1625546-f1a3-4c8e-9f98-065e66e22c19-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:56:54.739778 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:54.739555 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1625546-f1a3-4c8e-9f98-065e66e22c19-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 19:56:55.278315 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.278277 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz" event={"ID":"f1625546-f1a3-4c8e-9f98-065e66e22c19","Type":"ContainerDied","Data":"3b1ec56535b7de2fda625d0ec002046f512aeb45fc98347415541ecb0e789dcb"}
Apr 28 19:56:55.278315 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.278300 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"
Apr 28 19:56:55.278315 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.278323 2539 scope.go:117] "RemoveContainer" containerID="92ca8a2231bf6f8b89c77f748fcb6f7c5019137979b131c6ceef20ccb7a8d547"
Apr 28 19:56:55.289368 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.289351 2539 scope.go:117] "RemoveContainer" containerID="2d093c8dabc4382da893cd4a9c1e4db3c170ff7cc16e106ffaa371891cdcca52"
Apr 28 19:56:55.296732 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.296710 2539 scope.go:117] "RemoveContainer" containerID="e57184524694e4de9830b677811884ca5fc524aed6b5b787fb3ab30bcb3b4988"
Apr 28 19:56:55.300805 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.300781 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"]
Apr 28 19:56:55.305007 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:55.304988 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-ftgmz"]
Apr 28 19:56:56.271720 ip-10-0-143-206 kubenswrapper[2539]: I0428 19:56:56.271687 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" path="/var/lib/kubelet/pods/f1625546-f1a3-4c8e-9f98-065e66e22c19/volumes"
Apr 28 20:01:14.344433 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:01:14.344333 2539 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:01:14.349135 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:01:14.349117 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:03:29.489589 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489548 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"]
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489842 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489854 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489862 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489868 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489876 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="storage-initializer"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489884 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="storage-initializer"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]:
I0428 20:03:29.489896 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kube-rbac-proxy"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489902 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kube-rbac-proxy"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489912 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kube-rbac-proxy"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489918 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kube-rbac-proxy"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489925 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="storage-initializer"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489932 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="storage-initializer"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489984 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kserve-container"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.489994 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kserve-container"
Apr 28 20:03:29.490077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.490005 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1625546-f1a3-4c8e-9f98-065e66e22c19" containerName="kube-rbac-proxy"
Apr 28 20:03:29.490077
ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.490018 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca98e797-cfae-4f7b-b8fc-7c1bfb7475cb" containerName="kube-rbac-proxy"
Apr 28 20:03:29.493298 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.493280 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.495437 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.495400 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\""
Apr 28 20:03:29.495568 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.495498 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 28 20:03:29.495687 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.495661 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 28 20:03:29.496142 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.496107 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\""
Apr 28 20:03:29.496242 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.496112 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\""
Apr 28 20:03:29.502216 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.502196 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"]
Apr 28 20:03:29.521343 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.521309 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sddfr\" (UniqueName:
\"kubernetes.io/projected/8112dc5e-4550-44ee-821d-9b1de717eac2-kube-api-access-sddfr\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.521528 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.521420 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8112dc5e-4550-44ee-821d-9b1de717eac2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.521528 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.521458 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8112dc5e-4550-44ee-821d-9b1de717eac2-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.521528 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.521490 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8112dc5e-4550-44ee-821d-9b1de717eac2-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.621973 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.621936 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sddfr\" (UniqueName:
\"kubernetes.io/projected/8112dc5e-4550-44ee-821d-9b1de717eac2-kube-api-access-sddfr\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.622140 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.622003 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8112dc5e-4550-44ee-821d-9b1de717eac2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.622140 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.622036 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8112dc5e-4550-44ee-821d-9b1de717eac2-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.622140 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.622071 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8112dc5e-4550-44ee-821d-9b1de717eac2-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.622474 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.622457 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8112dc5e-4550-44ee-821d-9b1de717eac2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\"
(UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.622743 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.622722 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8112dc5e-4550-44ee-821d-9b1de717eac2-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.624634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.624614 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8112dc5e-4550-44ee-821d-9b1de717eac2-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.630076 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.630055 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sddfr\" (UniqueName: \"kubernetes.io/projected/8112dc5e-4550-44ee-821d-9b1de717eac2-kube-api-access-sddfr\") pod \"isvc-tensorflow-predictor-6756f669d7-wpjsw\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.804942 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.804901 2539 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"
Apr 28 20:03:29.923785 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.923761 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"]
Apr 28 20:03:29.926310 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:03:29.926274 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8112dc5e_4550_44ee_821d_9b1de717eac2.slice/crio-4769c8befe333440804df6c9fc2c5a3e325e74c62feb7cec21a56af25e77b3c0 WatchSource:0}: Error finding container 4769c8befe333440804df6c9fc2c5a3e325e74c62feb7cec21a56af25e77b3c0: Status 404 returned error can't find the container with id 4769c8befe333440804df6c9fc2c5a3e325e74c62feb7cec21a56af25e77b3c0
Apr 28 20:03:29.928073 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:29.928053 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 20:03:30.352888 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:30.352853 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerStarted","Data":"5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536"}
Apr 28 20:03:30.352888 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:30.352891 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerStarted","Data":"4769c8befe333440804df6c9fc2c5a3e325e74c62feb7cec21a56af25e77b3c0"}
Apr 28 20:03:34.365538 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:34.365506 2539 generic.go:358] "Generic (PLEG): container finished" podID="8112dc5e-4550-44ee-821d-9b1de717eac2"
containerID="5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536" exitCode=0 Apr 28 20:03:34.365866 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:34.365581 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerDied","Data":"5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536"} Apr 28 20:03:39.383342 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:39.383311 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerStarted","Data":"f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d"} Apr 28 20:03:39.383762 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:39.383351 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerStarted","Data":"ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b"} Apr 28 20:03:39.383762 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:39.383571 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:03:39.404067 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:39.404015 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podStartSLOduration=5.948203975 podStartE2EDuration="10.404000777s" podCreationTimestamp="2026-04-28 20:03:29 +0000 UTC" firstStartedPulling="2026-04-28 20:03:34.366782322 +0000 UTC m=+2840.663137667" lastFinishedPulling="2026-04-28 20:03:38.822579124 +0000 UTC m=+2845.118934469" observedRunningTime="2026-04-28 20:03:39.402210016 +0000 UTC m=+2845.698565379" 
watchObservedRunningTime="2026-04-28 20:03:39.404000777 +0000 UTC m=+2845.700356141" Apr 28 20:03:40.386918 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:40.386875 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:03:40.388200 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:40.388174 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 28 20:03:41.389917 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:41.389875 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 28 20:03:46.394437 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:46.394405 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:03:46.395007 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:46.394982 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 28 20:03:56.396067 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:03:56.395991 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:04:14.729324 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.729267 2539 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"] Apr 28 20:04:14.729858 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.729598 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" containerID="cri-o://ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b" gracePeriod=30 Apr 28 20:04:14.729858 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.729641 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" containerID="cri-o://f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d" gracePeriod=30 Apr 28 20:04:14.940465 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.940429 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5"] Apr 28 20:04:14.948454 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.943841 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:14.949037 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.949011 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 28 20:04:14.949162 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.949062 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 28 20:04:14.954207 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:14.954182 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5"] Apr 28 20:04:15.108390 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.108324 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7462b069-f01a-4b3b-9148-4162e29d0b0c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.108390 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.108399 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7t2b\" (UniqueName: \"kubernetes.io/projected/7462b069-f01a-4b3b-9148-4162e29d0b0c-kube-api-access-q7t2b\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.108613 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.108420 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7462b069-f01a-4b3b-9148-4162e29d0b0c-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.108613 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.108446 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7462b069-f01a-4b3b-9148-4162e29d0b0c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.209489 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.209451 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7t2b\" (UniqueName: \"kubernetes.io/projected/7462b069-f01a-4b3b-9148-4162e29d0b0c-kube-api-access-q7t2b\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.209489 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.209493 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7462b069-f01a-4b3b-9148-4162e29d0b0c-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.209765 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.209532 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7462b069-f01a-4b3b-9148-4162e29d0b0c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.209765 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.209615 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7462b069-f01a-4b3b-9148-4162e29d0b0c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.210037 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.210012 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7462b069-f01a-4b3b-9148-4162e29d0b0c-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.210265 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.210246 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7462b069-f01a-4b3b-9148-4162e29d0b0c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.211956 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.211941 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7462b069-f01a-4b3b-9148-4162e29d0b0c-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.218043 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.218021 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7t2b\" (UniqueName: \"kubernetes.io/projected/7462b069-f01a-4b3b-9148-4162e29d0b0c-kube-api-access-q7t2b\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.259392 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.259340 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:15.376365 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.376335 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5"] Apr 28 20:04:15.378217 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:04:15.378190 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7462b069_f01a_4b3b_9148_4162e29d0b0c.slice/crio-8ca0cd8b6640b9797165ec0b58d544b5ecb9ad247234d2c812fcd237602e21f8 WatchSource:0}: Error finding container 8ca0cd8b6640b9797165ec0b58d544b5ecb9ad247234d2c812fcd237602e21f8: Status 404 returned error can't find the container with id 8ca0cd8b6640b9797165ec0b58d544b5ecb9ad247234d2c812fcd237602e21f8 Apr 28 20:04:15.489017 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.488979 2539 generic.go:358] "Generic (PLEG): container finished" podID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerID="f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d" exitCode=2 
Apr 28 20:04:15.489205 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.489050 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerDied","Data":"f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d"} Apr 28 20:04:15.490504 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.490479 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerStarted","Data":"d213a98f1c80144a72314478fc40a0d430adf1b1c6c08e5fe42d3cc69106d24e"} Apr 28 20:04:15.490634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:15.490512 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerStarted","Data":"8ca0cd8b6640b9797165ec0b58d544b5ecb9ad247234d2c812fcd237602e21f8"} Apr 28 20:04:16.390452 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:16.390411 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 28 20:04:20.505238 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:20.505202 2539 generic.go:358] "Generic (PLEG): container finished" podID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerID="d213a98f1c80144a72314478fc40a0d430adf1b1c6c08e5fe42d3cc69106d24e" exitCode=0 Apr 28 20:04:20.505644 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:20.505252 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" 
event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerDied","Data":"d213a98f1c80144a72314478fc40a0d430adf1b1c6c08e5fe42d3cc69106d24e"} Apr 28 20:04:21.390566 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:21.390525 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 28 20:04:21.510392 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:21.510330 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerStarted","Data":"b2b4a50861a2fea39f0c39b95ed4a2361e0e611e07dd3837e1a975af78156ca0"} Apr 28 20:04:21.510882 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:21.510408 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerStarted","Data":"e48448aa4cd31c532bf9590b4b7425bea3bbbffd5567704850d683acdfece53a"} Apr 28 20:04:21.510882 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:21.510687 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:21.530577 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:21.530526 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podStartSLOduration=7.530511155 podStartE2EDuration="7.530511155s" podCreationTimestamp="2026-04-28 20:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-28 20:04:21.528268989 +0000 UTC m=+2887.824624352" watchObservedRunningTime="2026-04-28 20:04:21.530511155 +0000 UTC m=+2887.826866519" Apr 28 20:04:22.512904 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:22.512873 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:22.514117 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:22.514088 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 28 20:04:23.516010 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:23.515967 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 28 20:04:26.390481 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:26.390438 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 28 20:04:26.390979 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:26.390563 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:04:28.520457 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:28.520424 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:28.521098 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:28.521071 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 28 20:04:31.390927 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:31.390887 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 28 20:04:36.391117 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:36.391073 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 28 20:04:38.521811 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:38.521778 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:04:41.390693 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:41.390646 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 28 20:04:45.370930 ip-10-0-143-206 kubenswrapper[2539]: I0428 
20:04:45.370902 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:04:45.448808 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.448773 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8112dc5e-4550-44ee-821d-9b1de717eac2-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"8112dc5e-4550-44ee-821d-9b1de717eac2\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " Apr 28 20:04:45.448997 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.448830 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8112dc5e-4550-44ee-821d-9b1de717eac2-proxy-tls\") pod \"8112dc5e-4550-44ee-821d-9b1de717eac2\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " Apr 28 20:04:45.448997 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.448887 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sddfr\" (UniqueName: \"kubernetes.io/projected/8112dc5e-4550-44ee-821d-9b1de717eac2-kube-api-access-sddfr\") pod \"8112dc5e-4550-44ee-821d-9b1de717eac2\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " Apr 28 20:04:45.448997 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.448913 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8112dc5e-4550-44ee-821d-9b1de717eac2-kserve-provision-location\") pod \"8112dc5e-4550-44ee-821d-9b1de717eac2\" (UID: \"8112dc5e-4550-44ee-821d-9b1de717eac2\") " Apr 28 20:04:45.449309 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.449200 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8112dc5e-4550-44ee-821d-9b1de717eac2-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "8112dc5e-4550-44ee-821d-9b1de717eac2" (UID: "8112dc5e-4550-44ee-821d-9b1de717eac2"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:04:45.450951 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.450924 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8112dc5e-4550-44ee-821d-9b1de717eac2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8112dc5e-4550-44ee-821d-9b1de717eac2" (UID: "8112dc5e-4550-44ee-821d-9b1de717eac2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:04:45.451077 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.451055 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8112dc5e-4550-44ee-821d-9b1de717eac2-kube-api-access-sddfr" (OuterVolumeSpecName: "kube-api-access-sddfr") pod "8112dc5e-4550-44ee-821d-9b1de717eac2" (UID: "8112dc5e-4550-44ee-821d-9b1de717eac2"). InnerVolumeSpecName "kube-api-access-sddfr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:04:45.459649 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.459609 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8112dc5e-4550-44ee-821d-9b1de717eac2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8112dc5e-4550-44ee-821d-9b1de717eac2" (UID: "8112dc5e-4550-44ee-821d-9b1de717eac2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:04:45.549949 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.549854 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8112dc5e-4550-44ee-821d-9b1de717eac2-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:04:45.549949 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.549889 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8112dc5e-4550-44ee-821d-9b1de717eac2-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:04:45.549949 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.549900 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sddfr\" (UniqueName: \"kubernetes.io/projected/8112dc5e-4550-44ee-821d-9b1de717eac2-kube-api-access-sddfr\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:04:45.549949 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.549912 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8112dc5e-4550-44ee-821d-9b1de717eac2-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:04:45.580442 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.580405 2539 generic.go:358] "Generic (PLEG): container finished" podID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerID="ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b" exitCode=137 Apr 28 20:04:45.580608 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.580481 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" 
event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerDied","Data":"ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b"} Apr 28 20:04:45.580608 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.580512 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" event={"ID":"8112dc5e-4550-44ee-821d-9b1de717eac2","Type":"ContainerDied","Data":"4769c8befe333440804df6c9fc2c5a3e325e74c62feb7cec21a56af25e77b3c0"} Apr 28 20:04:45.580608 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.580533 2539 scope.go:117] "RemoveContainer" containerID="f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d" Apr 28 20:04:45.580608 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.580547 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw" Apr 28 20:04:45.588538 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.588512 2539 scope.go:117] "RemoveContainer" containerID="ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b" Apr 28 20:04:45.595482 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.595458 2539 scope.go:117] "RemoveContainer" containerID="5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536" Apr 28 20:04:45.601494 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.601469 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"] Apr 28 20:04:45.602548 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.602524 2539 scope.go:117] "RemoveContainer" containerID="f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d" Apr 28 20:04:45.602811 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:04:45.602788 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d\": container with ID starting with f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d not found: ID does not exist" containerID="f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d" Apr 28 20:04:45.602895 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.602819 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d"} err="failed to get container status \"f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d\": rpc error: code = NotFound desc = could not find container \"f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d\": container with ID starting with f1e6481aa3d92c2e6d9e358bf2116b0563136f769252359a17652eabc5b66e7d not found: ID does not exist" Apr 28 20:04:45.602895 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.602839 2539 scope.go:117] "RemoveContainer" containerID="ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b" Apr 28 20:04:45.603137 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:04:45.603109 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b\": container with ID starting with ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b not found: ID does not exist" containerID="ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b" Apr 28 20:04:45.603193 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.603145 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b"} err="failed to get container status \"ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b\": rpc error: code = NotFound desc = could not find container 
\"ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b\": container with ID starting with ac8cb91d584385db080204b2d5707afca79286f93cf6c9b7e9c9463d7930c13b not found: ID does not exist" Apr 28 20:04:45.603193 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.603171 2539 scope.go:117] "RemoveContainer" containerID="5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536" Apr 28 20:04:45.603443 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:04:45.603420 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536\": container with ID starting with 5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536 not found: ID does not exist" containerID="5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536" Apr 28 20:04:45.603659 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.603446 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536"} err="failed to get container status \"5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536\": rpc error: code = NotFound desc = could not find container \"5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536\": container with ID starting with 5fb4ab0a96353cc43de227b30eff962dfcfdbdc643713b975eb803efcb7b2536 not found: ID does not exist" Apr 28 20:04:45.605547 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:45.605527 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-wpjsw"] Apr 28 20:04:46.271198 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:46.271165 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" path="/var/lib/kubelet/pods/8112dc5e-4550-44ee-821d-9b1de717eac2/volumes" Apr 28 20:04:56.179278 
ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.179243 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5"] Apr 28 20:04:56.179767 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.179611 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" containerID="cri-o://e48448aa4cd31c532bf9590b4b7425bea3bbbffd5567704850d683acdfece53a" gracePeriod=30 Apr 28 20:04:56.179767 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.179637 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" containerID="cri-o://b2b4a50861a2fea39f0c39b95ed4a2361e0e611e07dd3837e1a975af78156ca0" gracePeriod=30 Apr 28 20:04:56.490019 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.489936 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld"] Apr 28 20:04:56.490259 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490246 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" Apr 28 20:04:56.490310 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490262 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" Apr 28 20:04:56.490310 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490271 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" Apr 28 20:04:56.490310 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490277 2539 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" Apr 28 20:04:56.490310 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490287 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="storage-initializer" Apr 28 20:04:56.490310 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490293 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="storage-initializer" Apr 28 20:04:56.490505 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490342 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kserve-container" Apr 28 20:04:56.490505 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.490352 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="8112dc5e-4550-44ee-821d-9b1de717eac2" containerName="kube-rbac-proxy" Apr 28 20:04:56.493486 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.493468 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.495674 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.495650 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 28 20:04:56.495806 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.495699 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 28 20:04:56.509526 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.509503 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld"] Apr 28 20:04:56.618396 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.618344 2539 generic.go:358] "Generic (PLEG): container finished" podID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerID="b2b4a50861a2fea39f0c39b95ed4a2361e0e611e07dd3837e1a975af78156ca0" exitCode=2 Apr 28 20:04:56.618554 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.618408 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerDied","Data":"b2b4a50861a2fea39f0c39b95ed4a2361e0e611e07dd3837e1a975af78156ca0"} Apr 28 20:04:56.651017 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.650981 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkp5s\" (UniqueName: \"kubernetes.io/projected/a4fe0617-f47f-471d-a3af-2a1838349273-kube-api-access-wkp5s\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.651017 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.651017 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4fe0617-f47f-471d-a3af-2a1838349273-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.651224 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.651055 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4fe0617-f47f-471d-a3af-2a1838349273-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.651224 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.651097 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4fe0617-f47f-471d-a3af-2a1838349273-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.752073 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.752039 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkp5s\" (UniqueName: \"kubernetes.io/projected/a4fe0617-f47f-471d-a3af-2a1838349273-kube-api-access-wkp5s\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.752245 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.752079 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a4fe0617-f47f-471d-a3af-2a1838349273-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.752245 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.752123 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4fe0617-f47f-471d-a3af-2a1838349273-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.752245 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.752155 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4fe0617-f47f-471d-a3af-2a1838349273-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.752654 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.752627 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4fe0617-f47f-471d-a3af-2a1838349273-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.752904 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.752887 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4fe0617-f47f-471d-a3af-2a1838349273-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: 
\"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.754668 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.754648 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4fe0617-f47f-471d-a3af-2a1838349273-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.760949 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.760913 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkp5s\" (UniqueName: \"kubernetes.io/projected/a4fe0617-f47f-471d-a3af-2a1838349273-kube-api-access-wkp5s\") pod \"isvc-triton-predictor-84bb65d94b-jvtld\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.802938 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.802900 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:04:56.923797 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:56.923773 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld"] Apr 28 20:04:56.926359 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:04:56.926333 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4fe0617_f47f_471d_a3af_2a1838349273.slice/crio-d20aaaff6121e53438fc192a8682ecffa43828512813615f4539125d83a03b7e WatchSource:0}: Error finding container d20aaaff6121e53438fc192a8682ecffa43828512813615f4539125d83a03b7e: Status 404 returned error can't find the container with id d20aaaff6121e53438fc192a8682ecffa43828512813615f4539125d83a03b7e Apr 28 20:04:57.622327 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:57.622292 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerStarted","Data":"d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137"} Apr 28 20:04:57.622327 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:57.622326 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerStarted","Data":"d20aaaff6121e53438fc192a8682ecffa43828512813615f4539125d83a03b7e"} Apr 28 20:04:58.516439 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:04:58.516395 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 28 20:05:01.635740 
ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:01.635702 2539 generic.go:358] "Generic (PLEG): container finished" podID="a4fe0617-f47f-471d-a3af-2a1838349273" containerID="d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137" exitCode=0 Apr 28 20:05:01.636127 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:01.635778 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerDied","Data":"d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137"} Apr 28 20:05:03.516963 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:03.516737 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 28 20:05:08.517104 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:08.516988 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 28 20:05:08.517773 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:08.517344 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:05:13.516300 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:13.516252 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 28 20:05:18.517305 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:18.517257 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 28 20:05:23.516194 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:23.516141 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 28 20:05:26.747272 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.747149 2539 generic.go:358] "Generic (PLEG): container finished" podID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerID="e48448aa4cd31c532bf9590b4b7425bea3bbbffd5567704850d683acdfece53a" exitCode=137 Apr 28 20:05:26.747272 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.747209 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerDied","Data":"e48448aa4cd31c532bf9590b4b7425bea3bbbffd5567704850d683acdfece53a"} Apr 28 20:05:26.870432 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.870399 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:05:26.930775 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.930728 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7t2b\" (UniqueName: \"kubernetes.io/projected/7462b069-f01a-4b3b-9148-4162e29d0b0c-kube-api-access-q7t2b\") pod \"7462b069-f01a-4b3b-9148-4162e29d0b0c\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " Apr 28 20:05:26.931096 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.930803 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7462b069-f01a-4b3b-9148-4162e29d0b0c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"7462b069-f01a-4b3b-9148-4162e29d0b0c\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " Apr 28 20:05:26.931224 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.931163 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7462b069-f01a-4b3b-9148-4162e29d0b0c-proxy-tls\") pod \"7462b069-f01a-4b3b-9148-4162e29d0b0c\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " Apr 28 20:05:26.931307 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.931275 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7462b069-f01a-4b3b-9148-4162e29d0b0c-kserve-provision-location\") pod \"7462b069-f01a-4b3b-9148-4162e29d0b0c\" (UID: \"7462b069-f01a-4b3b-9148-4162e29d0b0c\") " Apr 28 20:05:26.931836 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.931800 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462b069-f01a-4b3b-9148-4162e29d0b0c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "7462b069-f01a-4b3b-9148-4162e29d0b0c" (UID: "7462b069-f01a-4b3b-9148-4162e29d0b0c"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:05:26.934284 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.934241 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7462b069-f01a-4b3b-9148-4162e29d0b0c-kube-api-access-q7t2b" (OuterVolumeSpecName: "kube-api-access-q7t2b") pod "7462b069-f01a-4b3b-9148-4162e29d0b0c" (UID: "7462b069-f01a-4b3b-9148-4162e29d0b0c"). InnerVolumeSpecName "kube-api-access-q7t2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:05:26.936623 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.936586 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462b069-f01a-4b3b-9148-4162e29d0b0c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7462b069-f01a-4b3b-9148-4162e29d0b0c" (UID: "7462b069-f01a-4b3b-9148-4162e29d0b0c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:05:26.944709 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:26.944677 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7462b069-f01a-4b3b-9148-4162e29d0b0c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7462b069-f01a-4b3b-9148-4162e29d0b0c" (UID: "7462b069-f01a-4b3b-9148-4162e29d0b0c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:05:27.032308 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.032276 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7462b069-f01a-4b3b-9148-4162e29d0b0c-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:05:27.032308 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.032308 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7t2b\" (UniqueName: \"kubernetes.io/projected/7462b069-f01a-4b3b-9148-4162e29d0b0c-kube-api-access-q7t2b\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:05:27.032554 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.032324 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7462b069-f01a-4b3b-9148-4162e29d0b0c-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:05:27.032554 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.032337 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7462b069-f01a-4b3b-9148-4162e29d0b0c-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:05:27.753098 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.752917 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" event={"ID":"7462b069-f01a-4b3b-9148-4162e29d0b0c","Type":"ContainerDied","Data":"8ca0cd8b6640b9797165ec0b58d544b5ecb9ad247234d2c812fcd237602e21f8"} Apr 28 20:05:27.753098 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.752991 2539 scope.go:117] "RemoveContainer" containerID="b2b4a50861a2fea39f0c39b95ed4a2361e0e611e07dd3837e1a975af78156ca0" Apr 28 20:05:27.753098 ip-10-0-143-206 
kubenswrapper[2539]: I0428 20:05:27.753036 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5" Apr 28 20:05:27.763479 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.763364 2539 scope.go:117] "RemoveContainer" containerID="e48448aa4cd31c532bf9590b4b7425bea3bbbffd5567704850d683acdfece53a" Apr 28 20:05:27.774154 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.774121 2539 scope.go:117] "RemoveContainer" containerID="d213a98f1c80144a72314478fc40a0d430adf1b1c6c08e5fe42d3cc69106d24e" Apr 28 20:05:27.785804 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.785758 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5"] Apr 28 20:05:27.789815 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:27.789787 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-fh7l5"] Apr 28 20:05:28.273385 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:05:28.273329 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" path="/var/lib/kubelet/pods/7462b069-f01a-4b3b-9148-4162e29d0b0c/volumes" Apr 28 20:06:55.511088 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:55.511010 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 20:06:55.511088 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:55.511018 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log" Apr 28 20:06:57.037417 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:57.037311 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" 
event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerStarted","Data":"3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954"} Apr 28 20:06:57.037417 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:57.037350 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerStarted","Data":"da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6"} Apr 28 20:06:57.037417 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:57.037410 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:06:57.082519 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:57.082452 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" podStartSLOduration=5.935799402 podStartE2EDuration="2m1.082432817s" podCreationTimestamp="2026-04-28 20:04:56 +0000 UTC" firstStartedPulling="2026-04-28 20:05:01.636847562 +0000 UTC m=+2927.933202905" lastFinishedPulling="2026-04-28 20:06:56.783480975 +0000 UTC m=+3043.079836320" observedRunningTime="2026-04-28 20:06:57.080100996 +0000 UTC m=+3043.376456361" watchObservedRunningTime="2026-04-28 20:06:57.082432817 +0000 UTC m=+3043.378788182" Apr 28 20:06:58.042339 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:58.042306 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:06:58.045333 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:58.045188 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 28 
20:06:59.044678 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:06:59.044641 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 28 20:07:04.049772 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:04.049742 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:07:04.050430 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:04.050414 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:07:07.972493 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:07.972461 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld"] Apr 28 20:07:07.972950 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:07.972762 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kserve-container" containerID="cri-o://da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6" gracePeriod=30 Apr 28 20:07:07.972950 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:07.972790 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kube-rbac-proxy" containerID="cri-o://3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954" gracePeriod=30 Apr 28 20:07:08.289936 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.289905 2539 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78"] Apr 28 20:07:08.290203 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290192 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" Apr 28 20:07:08.290262 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290205 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" Apr 28 20:07:08.290262 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290223 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" Apr 28 20:07:08.290262 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290229 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" Apr 28 20:07:08.290262 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290236 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="storage-initializer" Apr 28 20:07:08.290262 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290243 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="storage-initializer" Apr 28 20:07:08.290442 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290285 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kube-rbac-proxy" Apr 28 20:07:08.290442 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.290294 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="7462b069-f01a-4b3b-9148-4162e29d0b0c" containerName="kserve-container" Apr 28 20:07:08.314170 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.314127 2539 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78"] Apr 28 20:07:08.314347 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.314256 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.316397 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.316347 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 28 20:07:08.316515 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.316347 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 28 20:07:08.454954 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.454918 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jtb\" (UniqueName: \"kubernetes.io/projected/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kube-api-access-g8jtb\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.455129 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.454962 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.455129 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.455064 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.455129 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.455091 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.556047 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.555957 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.556047 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.556004 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.556047 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.556025 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jtb\" (UniqueName: \"kubernetes.io/projected/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kube-api-access-g8jtb\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.556291 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.556053 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.556492 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.556470 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.556756 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.556738 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.558498 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.558481 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.564694 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.564671 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jtb\" (UniqueName: \"kubernetes.io/projected/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kube-api-access-g8jtb\") pod \"isvc-xgboost-predictor-8689c4cfcc-w8f78\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.624821 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.624782 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:08.748222 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:08.748192 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78"] Apr 28 20:07:08.750953 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:07:08.750923 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7238d14d_1be1_44bb_95b7_b55e7e9e15b0.slice/crio-24dac4f9fa590371f88a726cc7528146b46978c2526475739a244be16aea2a9a WatchSource:0}: Error finding container 24dac4f9fa590371f88a726cc7528146b46978c2526475739a244be16aea2a9a: Status 404 returned error can't find the container with id 24dac4f9fa590371f88a726cc7528146b46978c2526475739a244be16aea2a9a Apr 28 20:07:09.045284 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:09.045238 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 28 20:07:09.072431 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:09.072349 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" 
event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerStarted","Data":"da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058"} Apr 28 20:07:09.072431 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:09.072410 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerStarted","Data":"24dac4f9fa590371f88a726cc7528146b46978c2526475739a244be16aea2a9a"} Apr 28 20:07:09.074300 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:09.074265 2539 generic.go:358] "Generic (PLEG): container finished" podID="a4fe0617-f47f-471d-a3af-2a1838349273" containerID="3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954" exitCode=2 Apr 28 20:07:09.074435 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:09.074335 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerDied","Data":"3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954"} Apr 28 20:07:10.837097 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.837038 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:07:10.975294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.975256 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkp5s\" (UniqueName: \"kubernetes.io/projected/a4fe0617-f47f-471d-a3af-2a1838349273-kube-api-access-wkp5s\") pod \"a4fe0617-f47f-471d-a3af-2a1838349273\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " Apr 28 20:07:10.975294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.975302 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4fe0617-f47f-471d-a3af-2a1838349273-proxy-tls\") pod \"a4fe0617-f47f-471d-a3af-2a1838349273\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " Apr 28 20:07:10.975637 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.975358 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4fe0617-f47f-471d-a3af-2a1838349273-isvc-triton-kube-rbac-proxy-sar-config\") pod \"a4fe0617-f47f-471d-a3af-2a1838349273\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " Apr 28 20:07:10.975637 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.975506 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4fe0617-f47f-471d-a3af-2a1838349273-kserve-provision-location\") pod \"a4fe0617-f47f-471d-a3af-2a1838349273\" (UID: \"a4fe0617-f47f-471d-a3af-2a1838349273\") " Apr 28 20:07:10.975774 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.975736 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fe0617-f47f-471d-a3af-2a1838349273-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod 
"a4fe0617-f47f-471d-a3af-2a1838349273" (UID: "a4fe0617-f47f-471d-a3af-2a1838349273"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:07:10.975904 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.975880 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fe0617-f47f-471d-a3af-2a1838349273-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a4fe0617-f47f-471d-a3af-2a1838349273" (UID: "a4fe0617-f47f-471d-a3af-2a1838349273"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:07:10.977539 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.977511 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fe0617-f47f-471d-a3af-2a1838349273-kube-api-access-wkp5s" (OuterVolumeSpecName: "kube-api-access-wkp5s") pod "a4fe0617-f47f-471d-a3af-2a1838349273" (UID: "a4fe0617-f47f-471d-a3af-2a1838349273"). InnerVolumeSpecName "kube-api-access-wkp5s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:07:10.977644 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:10.977587 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fe0617-f47f-471d-a3af-2a1838349273-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a4fe0617-f47f-471d-a3af-2a1838349273" (UID: "a4fe0617-f47f-471d-a3af-2a1838349273"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:07:11.076849 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.076811 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4fe0617-f47f-471d-a3af-2a1838349273-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:07:11.076849 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.076845 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkp5s\" (UniqueName: \"kubernetes.io/projected/a4fe0617-f47f-471d-a3af-2a1838349273-kube-api-access-wkp5s\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:07:11.076849 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.076856 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4fe0617-f47f-471d-a3af-2a1838349273-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:07:11.077315 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.076867 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a4fe0617-f47f-471d-a3af-2a1838349273-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:07:11.081624 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.081597 2539 generic.go:358] "Generic (PLEG): container finished" podID="a4fe0617-f47f-471d-a3af-2a1838349273" containerID="da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6" exitCode=0 Apr 28 20:07:11.081768 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.081637 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerDied","Data":"da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6"} 
Apr 28 20:07:11.081768 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.081660 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" event={"ID":"a4fe0617-f47f-471d-a3af-2a1838349273","Type":"ContainerDied","Data":"d20aaaff6121e53438fc192a8682ecffa43828512813615f4539125d83a03b7e"} Apr 28 20:07:11.081768 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.081675 2539 scope.go:117] "RemoveContainer" containerID="3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954" Apr 28 20:07:11.081768 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.081679 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld" Apr 28 20:07:11.089486 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.089265 2539 scope.go:117] "RemoveContainer" containerID="da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6" Apr 28 20:07:11.096275 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.096258 2539 scope.go:117] "RemoveContainer" containerID="d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137" Apr 28 20:07:11.103313 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.103296 2539 scope.go:117] "RemoveContainer" containerID="3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954" Apr 28 20:07:11.103487 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.103464 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld"] Apr 28 20:07:11.103610 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:07:11.103595 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954\": container with ID starting with 3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954 not found: ID does not exist" 
containerID="3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954" Apr 28 20:07:11.103658 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.103619 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954"} err="failed to get container status \"3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954\": rpc error: code = NotFound desc = could not find container \"3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954\": container with ID starting with 3947e685727a18e709077571a5dca59982107ed83057a7a285342ce631716954 not found: ID does not exist" Apr 28 20:07:11.103658 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.103637 2539 scope.go:117] "RemoveContainer" containerID="da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6" Apr 28 20:07:11.103889 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:07:11.103868 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6\": container with ID starting with da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6 not found: ID does not exist" containerID="da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6" Apr 28 20:07:11.103934 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.103900 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6"} err="failed to get container status \"da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6\": rpc error: code = NotFound desc = could not find container \"da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6\": container with ID starting with da401339507fb7c8a665e4e9648b0eb56afa1945f63042152dc7478fdb31d0c6 not found: ID does not exist" Apr 28 
20:07:11.103934 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.103924 2539 scope.go:117] "RemoveContainer" containerID="d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137" Apr 28 20:07:11.104145 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:07:11.104129 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137\": container with ID starting with d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137 not found: ID does not exist" containerID="d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137" Apr 28 20:07:11.104199 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.104150 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137"} err="failed to get container status \"d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137\": rpc error: code = NotFound desc = could not find container \"d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137\": container with ID starting with d3f492248a27e2b0a6549f827fdb1081786ed96e7a383c0d14045288ae540137 not found: ID does not exist" Apr 28 20:07:11.109499 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:11.109479 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-jvtld"] Apr 28 20:07:12.271407 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:12.271357 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" path="/var/lib/kubelet/pods/a4fe0617-f47f-471d-a3af-2a1838349273/volumes" Apr 28 20:07:13.088785 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:13.088750 2539 generic.go:358] "Generic (PLEG): container finished" podID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" 
containerID="da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058" exitCode=0 Apr 28 20:07:13.088966 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:13.088825 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerDied","Data":"da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058"} Apr 28 20:07:35.155516 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:35.155479 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerStarted","Data":"4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e"} Apr 28 20:07:35.155964 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:35.155522 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerStarted","Data":"d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0"} Apr 28 20:07:35.155964 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:35.155881 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:35.155964 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:35.155908 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:35.157340 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:35.157315 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:07:35.177482 
ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:35.177420 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podStartSLOduration=5.610415908 podStartE2EDuration="27.177404063s" podCreationTimestamp="2026-04-28 20:07:08 +0000 UTC" firstStartedPulling="2026-04-28 20:07:13.090053157 +0000 UTC m=+3059.386408499" lastFinishedPulling="2026-04-28 20:07:34.657041308 +0000 UTC m=+3080.953396654" observedRunningTime="2026-04-28 20:07:35.176039722 +0000 UTC m=+3081.472395110" watchObservedRunningTime="2026-04-28 20:07:35.177404063 +0000 UTC m=+3081.473759485" Apr 28 20:07:36.158949 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:36.158906 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:07:41.163023 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:41.162996 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:07:41.163653 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:41.163623 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:07:51.163720 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:07:51.163676 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:08:01.164515 
ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:01.164476 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:08:11.163712 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:11.163670 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:08:21.164064 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:21.164021 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:08:31.164439 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:31.164335 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:08:41.165091 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:41.165062 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:08:47.669665 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:47.669619 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78"] Apr 28 20:08:47.670172 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:47.670025 2539 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" containerID="cri-o://d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0" gracePeriod=30 Apr 28 20:08:47.670172 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:47.670081 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kube-rbac-proxy" containerID="cri-o://4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e" gracePeriod=30 Apr 28 20:08:48.359997 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:48.359962 2539 generic.go:358] "Generic (PLEG): container finished" podID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerID="4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e" exitCode=2 Apr 28 20:08:48.360183 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:48.360026 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerDied","Data":"4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e"} Apr 28 20:08:51.159838 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.159793 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 28 20:08:51.164101 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.164074 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 28 20:08:51.509016 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.508989 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:08:51.625018 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.624980 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kserve-provision-location\") pod \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " Apr 28 20:08:51.625018 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.625026 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8jtb\" (UniqueName: \"kubernetes.io/projected/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kube-api-access-g8jtb\") pod \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " Apr 28 20:08:51.625239 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.625069 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-proxy-tls\") pod \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " Apr 28 20:08:51.625239 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.625151 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\" (UID: \"7238d14d-1be1-44bb-95b7-b55e7e9e15b0\") " Apr 28 20:08:51.625395 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.625342 2539 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7238d14d-1be1-44bb-95b7-b55e7e9e15b0" (UID: "7238d14d-1be1-44bb-95b7-b55e7e9e15b0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:51.625639 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.625612 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "7238d14d-1be1-44bb-95b7-b55e7e9e15b0" (UID: "7238d14d-1be1-44bb-95b7-b55e7e9e15b0"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:08:51.627199 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.627168 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kube-api-access-g8jtb" (OuterVolumeSpecName: "kube-api-access-g8jtb") pod "7238d14d-1be1-44bb-95b7-b55e7e9e15b0" (UID: "7238d14d-1be1-44bb-95b7-b55e7e9e15b0"). InnerVolumeSpecName "kube-api-access-g8jtb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:08:51.627199 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.627185 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7238d14d-1be1-44bb-95b7-b55e7e9e15b0" (UID: "7238d14d-1be1-44bb-95b7-b55e7e9e15b0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:08:51.726488 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.726408 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:08:51.726488 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.726437 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:08:51.726488 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.726448 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8jtb\" (UniqueName: \"kubernetes.io/projected/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-kube-api-access-g8jtb\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:08:51.726488 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:51.726461 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7238d14d-1be1-44bb-95b7-b55e7e9e15b0-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:08:52.372040 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.372005 2539 generic.go:358] "Generic (PLEG): container finished" podID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerID="d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0" exitCode=0 Apr 28 20:08:52.372464 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.372047 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerDied","Data":"d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0"} 
Apr 28 20:08:52.372464 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.372081 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" Apr 28 20:08:52.372464 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.372091 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78" event={"ID":"7238d14d-1be1-44bb-95b7-b55e7e9e15b0","Type":"ContainerDied","Data":"24dac4f9fa590371f88a726cc7528146b46978c2526475739a244be16aea2a9a"} Apr 28 20:08:52.372464 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.372112 2539 scope.go:117] "RemoveContainer" containerID="4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e" Apr 28 20:08:52.380225 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.380208 2539 scope.go:117] "RemoveContainer" containerID="d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0" Apr 28 20:08:52.387364 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.387173 2539 scope.go:117] "RemoveContainer" containerID="da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058" Apr 28 20:08:52.392735 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.392713 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78"] Apr 28 20:08:52.394677 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.394657 2539 scope.go:117] "RemoveContainer" containerID="4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e" Apr 28 20:08:52.394973 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:08:52.394950 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e\": container with ID starting with 4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e not found: ID does not exist" 
containerID="4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e" Apr 28 20:08:52.395069 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.394987 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e"} err="failed to get container status \"4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e\": rpc error: code = NotFound desc = could not find container \"4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e\": container with ID starting with 4294d12e78ff6f4a87b955978a1366cbc327dac89043dda492ba267d2df9a98e not found: ID does not exist" Apr 28 20:08:52.395069 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.395005 2539 scope.go:117] "RemoveContainer" containerID="d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0" Apr 28 20:08:52.395288 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:08:52.395267 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0\": container with ID starting with d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0 not found: ID does not exist" containerID="d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0" Apr 28 20:08:52.395391 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.395293 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0"} err="failed to get container status \"d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0\": rpc error: code = NotFound desc = could not find container \"d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0\": container with ID starting with d4aaad0d6ff7ec79158f3130f73779c67cf1734250820029335b424b1ba41fc0 not found: ID does not exist" Apr 28 
20:08:52.395391 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.395307 2539 scope.go:117] "RemoveContainer" containerID="da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058" Apr 28 20:08:52.395560 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:08:52.395534 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058\": container with ID starting with da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058 not found: ID does not exist" containerID="da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058" Apr 28 20:08:52.395601 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.395565 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058"} err="failed to get container status \"da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058\": rpc error: code = NotFound desc = could not find container \"da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058\": container with ID starting with da2d7deab54be9f1f14bf4b7a96625648955596fd9acbf2b275b1be911c31058 not found: ID does not exist" Apr 28 20:08:52.398252 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:52.398230 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-w8f78"] Apr 28 20:08:54.272722 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:08:54.272691 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" path="/var/lib/kubelet/pods/7238d14d-1be1-44bb-95b7-b55e7e9e15b0/volumes" Apr 28 20:10:28.740853 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.740817 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"] Apr 28 20:10:28.743741 
ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741122 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kube-rbac-proxy" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741135 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kube-rbac-proxy" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741145 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741151 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741158 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kube-rbac-proxy" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741164 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kube-rbac-proxy" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741175 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kserve-container" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741180 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kserve-container" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741186 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" 
containerName="storage-initializer" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741191 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="storage-initializer" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741199 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="storage-initializer" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741204 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="storage-initializer" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741245 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kserve-container" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741253 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kserve-container" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741259 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4fe0617-f47f-471d-a3af-2a1838349273" containerName="kube-rbac-proxy" Apr 28 20:10:28.743741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.741265 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="7238d14d-1be1-44bb-95b7-b55e7e9e15b0" containerName="kube-rbac-proxy" Apr 28 20:10:28.744843 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.744829 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.747107 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.747077 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 28 20:10:28.747229 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.747114 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 20:10:28.747229 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.747151 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 20:10:28.747229 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.747157 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 28 20:10:28.747650 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.747634 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\"" Apr 28 20:10:28.753851 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.753831 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"] Apr 28 20:10:28.832414 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.832355 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.832601 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.832433 2539 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62f653e-3c7b-4fbe-84b4-fa023096aebe-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.832601 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.832467 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62f653e-3c7b-4fbe-84b4-fa023096aebe-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.832601 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.832489 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj67q\" (UniqueName: \"kubernetes.io/projected/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kube-api-access-nj67q\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.933586 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.933559 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62f653e-3c7b-4fbe-84b4-fa023096aebe-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.933775 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.933598 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62f653e-3c7b-4fbe-84b4-fa023096aebe-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.933775 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.933654 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj67q\" (UniqueName: \"kubernetes.io/projected/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kube-api-access-nj67q\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.933775 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.933736 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.934130 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.934108 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.934298 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.934280 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f62f653e-3c7b-4fbe-84b4-fa023096aebe-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.936166 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.936144 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62f653e-3c7b-4fbe-84b4-fa023096aebe-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:28.941803 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:28.941782 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj67q\" (UniqueName: \"kubernetes.io/projected/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kube-api-access-nj67q\") pod \"isvc-xgboost-runtime-predictor-779db84d9-w5ldr\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:29.056445 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:29.056409 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:29.179765 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:29.179735 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"] Apr 28 20:10:29.182734 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:10:29.182708 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf62f653e_3c7b_4fbe_84b4_fa023096aebe.slice/crio-5bc59964e94f80200d2d6ae0942cd22f681da7e96125f8021e737f908c083f8e WatchSource:0}: Error finding container 5bc59964e94f80200d2d6ae0942cd22f681da7e96125f8021e737f908c083f8e: Status 404 returned error can't find the container with id 5bc59964e94f80200d2d6ae0942cd22f681da7e96125f8021e737f908c083f8e Apr 28 20:10:29.184582 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:29.184563 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:10:29.641279 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:29.641238 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerStarted","Data":"2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de"} Apr 28 20:10:29.641279 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:29.641278 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerStarted","Data":"5bc59964e94f80200d2d6ae0942cd22f681da7e96125f8021e737f908c083f8e"} Apr 28 20:10:33.654224 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:33.654178 2539 generic.go:358] "Generic (PLEG): container finished" podID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" 
containerID="2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de" exitCode=0 Apr 28 20:10:33.654626 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:33.654252 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerDied","Data":"2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de"} Apr 28 20:10:34.659408 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:34.659353 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerStarted","Data":"66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff"} Apr 28 20:10:34.659408 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:34.659411 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerStarted","Data":"a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca"} Apr 28 20:10:34.659884 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:34.659683 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:34.659884 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:34.659790 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:34.660941 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:34.660916 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: 
connection refused" Apr 28 20:10:34.679935 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:34.679876 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podStartSLOduration=6.679858951 podStartE2EDuration="6.679858951s" podCreationTimestamp="2026-04-28 20:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:10:34.678722531 +0000 UTC m=+3260.975077897" watchObservedRunningTime="2026-04-28 20:10:34.679858951 +0000 UTC m=+3260.976214315" Apr 28 20:10:35.662492 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:35.662446 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:10:40.667638 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:40.667608 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:10:40.668693 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:40.668658 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:10:50.669584 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:10:50.669539 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 
20:11:00.669290 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:00.669253 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:11:10.668570 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:10.668531 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:11:20.669118 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:20.669078 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:11:30.669085 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:30.669005 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:11:40.670101 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:40.670071 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:11:48.730617 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:48.730583 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"] Apr 28 20:11:48.731082 ip-10-0-143-206 kubenswrapper[2539]: 
I0428 20:11:48.730903 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" containerID="cri-o://a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca" gracePeriod=30 Apr 28 20:11:48.731082 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:48.730959 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kube-rbac-proxy" containerID="cri-o://66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff" gracePeriod=30 Apr 28 20:11:48.872651 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:48.872620 2539 generic.go:358] "Generic (PLEG): container finished" podID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerID="66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff" exitCode=2 Apr 28 20:11:48.872803 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:48.872669 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerDied","Data":"66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff"} Apr 28 20:11:50.662708 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:50.662664 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 28 20:11:50.669034 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:50.669005 2539 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 28 20:11:52.472954 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.472917 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" Apr 28 20:11:52.634147 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.634055 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62f653e-3c7b-4fbe-84b4-fa023096aebe-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " Apr 28 20:11:52.634320 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.634170 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj67q\" (UniqueName: \"kubernetes.io/projected/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kube-api-access-nj67q\") pod \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " Apr 28 20:11:52.634320 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.634202 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62f653e-3c7b-4fbe-84b4-fa023096aebe-proxy-tls\") pod \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " Apr 28 20:11:52.634320 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.634230 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kserve-provision-location\") pod 
\"f62f653e-3c7b-4fbe-84b4-fa023096aebe\" (UID: \"f62f653e-3c7b-4fbe-84b4-fa023096aebe\") " Apr 28 20:11:52.634548 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.634517 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62f653e-3c7b-4fbe-84b4-fa023096aebe-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "f62f653e-3c7b-4fbe-84b4-fa023096aebe" (UID: "f62f653e-3c7b-4fbe-84b4-fa023096aebe"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 20:11:52.634613 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.634590 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f62f653e-3c7b-4fbe-84b4-fa023096aebe" (UID: "f62f653e-3c7b-4fbe-84b4-fa023096aebe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:11:52.636303 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.636281 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62f653e-3c7b-4fbe-84b4-fa023096aebe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f62f653e-3c7b-4fbe-84b4-fa023096aebe" (UID: "f62f653e-3c7b-4fbe-84b4-fa023096aebe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:11:52.636368 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.636349 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kube-api-access-nj67q" (OuterVolumeSpecName: "kube-api-access-nj67q") pod "f62f653e-3c7b-4fbe-84b4-fa023096aebe" (UID: "f62f653e-3c7b-4fbe-84b4-fa023096aebe"). 
InnerVolumeSpecName "kube-api-access-nj67q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:11:52.735634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.735597 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nj67q\" (UniqueName: \"kubernetes.io/projected/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kube-api-access-nj67q\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:11:52.735634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.735629 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f62f653e-3c7b-4fbe-84b4-fa023096aebe-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:11:52.735634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.735640 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f62f653e-3c7b-4fbe-84b4-fa023096aebe-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:11:52.735866 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.735649 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f62f653e-3c7b-4fbe-84b4-fa023096aebe-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:11:52.886636 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.886544 2539 generic.go:358] "Generic (PLEG): container finished" podID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerID="a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca" exitCode=0
Apr 28 20:11:52.886636 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.886582 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerDied","Data":"a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca"}
Apr 28 20:11:52.886636 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.886606 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr" event={"ID":"f62f653e-3c7b-4fbe-84b4-fa023096aebe","Type":"ContainerDied","Data":"5bc59964e94f80200d2d6ae0942cd22f681da7e96125f8021e737f908c083f8e"}
Apr 28 20:11:52.886636 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.886623 2539 scope.go:117] "RemoveContainer" containerID="66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff"
Apr 28 20:11:52.886636 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.886628 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"
Apr 28 20:11:52.894793 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.894768 2539 scope.go:117] "RemoveContainer" containerID="a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca"
Apr 28 20:11:52.901741 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.901723 2539 scope.go:117] "RemoveContainer" containerID="2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de"
Apr 28 20:11:52.908745 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.908718 2539 scope.go:117] "RemoveContainer" containerID="66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff"
Apr 28 20:11:52.909166 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:11:52.909025 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff\": container with ID starting with 66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff not found: ID does not exist" containerID="66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff"
Apr 28 20:11:52.909166 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.909066 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff"} err="failed to get container status \"66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff\": rpc error: code = NotFound desc = could not find container \"66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff\": container with ID starting with 66c953f88079b3730c29a0ad6ef78bc7a36865655c8acca760463b1c09cc3aff not found: ID does not exist"
Apr 28 20:11:52.909166 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.909094 2539 scope.go:117] "RemoveContainer" containerID="a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca"
Apr 28 20:11:52.909397 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:11:52.909350 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca\": container with ID starting with a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca not found: ID does not exist" containerID="a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca"
Apr 28 20:11:52.909443 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.909402 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca"} err="failed to get container status \"a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca\": rpc error: code = NotFound desc = could not find container \"a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca\": container with ID starting with a4dda44fab81ac9fe00a767737d1f297459d2608b324ca77c3f1e93e2c3cfcca not found: ID does not exist"
Apr 28 20:11:52.909484 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.909427 2539 scope.go:117] "RemoveContainer" containerID="2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de"
Apr 28 20:11:52.909762 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:11:52.909742 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de\": container with ID starting with 2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de not found: ID does not exist" containerID="2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de"
Apr 28 20:11:52.909830 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.909768 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de"} err="failed to get container status \"2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de\": rpc error: code = NotFound desc = could not find container \"2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de\": container with ID starting with 2a0475558dfaa04fb1e3c50f5a3b17c68c3595c425967beef573e9b50746c0de not found: ID does not exist"
Apr 28 20:11:52.910959 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.910937 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"]
Apr 28 20:11:52.912079 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:52.912061 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-w5ldr"]
Apr 28 20:11:54.271739 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:54.271706 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" path="/var/lib/kubelet/pods/f62f653e-3c7b-4fbe-84b4-fa023096aebe/volumes"
Apr 28 20:11:55.529907 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:55.529875 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:11:55.531323 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:11:55.531302 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:12:49.164085 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164051 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"]
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164341 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kube-rbac-proxy"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164351 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kube-rbac-proxy"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164363 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="storage-initializer"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164369 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="storage-initializer"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164394 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164401 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164451 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kube-rbac-proxy"
Apr 28 20:12:49.164581 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.164459 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="f62f653e-3c7b-4fbe-84b4-fa023096aebe" containerName="kserve-container"
Apr 28 20:12:49.167404 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.167385 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.173941 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.173861 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 28 20:12:49.174324 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.174284 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\""
Apr 28 20:12:49.174448 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.174405 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 28 20:12:49.174448 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.174437 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-44jch\""
Apr 28 20:12:49.174562 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.174489 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 28 20:12:49.181746 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.181722 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"]
Apr 28 20:12:49.296925 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.296886 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da8b7841-ebf3-4483-bda6-39e4f1627697-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.297114 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.296974 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4h9j\" (UniqueName: \"kubernetes.io/projected/da8b7841-ebf3-4483-bda6-39e4f1627697-kube-api-access-r4h9j\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.297114 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.297053 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da8b7841-ebf3-4483-bda6-39e4f1627697-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.297114 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.297099 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8b7841-ebf3-4483-bda6-39e4f1627697-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.397680 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.397640 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da8b7841-ebf3-4483-bda6-39e4f1627697-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.397680 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.397684 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4h9j\" (UniqueName: \"kubernetes.io/projected/da8b7841-ebf3-4483-bda6-39e4f1627697-kube-api-access-r4h9j\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.397973 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.397721 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da8b7841-ebf3-4483-bda6-39e4f1627697-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.397973 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.397748 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8b7841-ebf3-4483-bda6-39e4f1627697-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.398208 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.398182 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8b7841-ebf3-4483-bda6-39e4f1627697-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.398445 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.398422 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da8b7841-ebf3-4483-bda6-39e4f1627697-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.400193 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.400177 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da8b7841-ebf3-4483-bda6-39e4f1627697-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.406366 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.406325 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4h9j\" (UniqueName: \"kubernetes.io/projected/da8b7841-ebf3-4483-bda6-39e4f1627697-kube-api-access-r4h9j\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.477405 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.477304 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:49.605104 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:49.601517 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"]
Apr 28 20:12:50.042869 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:50.042831 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerStarted","Data":"f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94"}
Apr 28 20:12:50.042869 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:50.042872 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerStarted","Data":"f83983d2d5c20894702f90c970196569d7c936b268a28941b2b0baf5f51010d0"}
Apr 28 20:12:54.055672 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:54.055639 2539 generic.go:358] "Generic (PLEG): container finished" podID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerID="f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94" exitCode=0
Apr 28 20:12:54.056048 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:54.055684 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerDied","Data":"f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94"}
Apr 28 20:12:55.060643 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:55.060600 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerStarted","Data":"cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917"}
Apr 28 20:12:55.061138 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:55.060652 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerStarted","Data":"d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d"}
Apr 28 20:12:55.061138 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:55.060954 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:55.061138 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:55.060984 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:12:55.062263 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:55.062239 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:12:55.104329 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:55.104275 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podStartSLOduration=6.104259825 podStartE2EDuration="6.104259825s" podCreationTimestamp="2026-04-28 20:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:12:55.103757754 +0000 UTC m=+3401.400113117" watchObservedRunningTime="2026-04-28 20:12:55.104259825 +0000 UTC m=+3401.400615252"
Apr 28 20:12:56.063605 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:12:56.063548 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:13:01.067969 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:01.067940 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:13:01.068607 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:01.068581 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:13:11.068973 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:11.068930 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:13:21.068702 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:21.068655 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:13:31.069039 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:31.068997 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:13:41.068472 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:41.068432 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:13:51.068909 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:13:51.068866 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:14:01.069204 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:01.069174 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:14:09.072193 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:09.072162 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"]
Apr 28 20:14:09.072634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:09.072534 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" containerID="cri-o://d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d" gracePeriod=30
Apr 28 20:14:09.072634 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:09.072560 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kube-rbac-proxy" containerID="cri-o://cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917" gracePeriod=30
Apr 28 20:14:09.268904 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:09.268870 2539 generic.go:358] "Generic (PLEG): container finished" podID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerID="cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917" exitCode=2
Apr 28 20:14:09.269072 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:09.268927 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerDied","Data":"cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917"}
Apr 28 20:14:11.064670 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:11.064627 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused"
Apr 28 20:14:11.069070 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:11.069039 2539 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 28 20:14:12.812953 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.812921 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:14:12.878988 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.878895 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da8b7841-ebf3-4483-bda6-39e4f1627697-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"da8b7841-ebf3-4483-bda6-39e4f1627697\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") "
Apr 28 20:14:12.878988 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.878940 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4h9j\" (UniqueName: \"kubernetes.io/projected/da8b7841-ebf3-4483-bda6-39e4f1627697-kube-api-access-r4h9j\") pod \"da8b7841-ebf3-4483-bda6-39e4f1627697\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") "
Apr 28 20:14:12.878988 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.878971 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8b7841-ebf3-4483-bda6-39e4f1627697-kserve-provision-location\") pod \"da8b7841-ebf3-4483-bda6-39e4f1627697\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") "
Apr 28 20:14:12.879256 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.879043 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da8b7841-ebf3-4483-bda6-39e4f1627697-proxy-tls\") pod \"da8b7841-ebf3-4483-bda6-39e4f1627697\" (UID: \"da8b7841-ebf3-4483-bda6-39e4f1627697\") "
Apr 28 20:14:12.879306 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.879282 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8b7841-ebf3-4483-bda6-39e4f1627697-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "da8b7841-ebf3-4483-bda6-39e4f1627697" (UID: "da8b7841-ebf3-4483-bda6-39e4f1627697"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:14:12.879306 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.879293 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da8b7841-ebf3-4483-bda6-39e4f1627697-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da8b7841-ebf3-4483-bda6-39e4f1627697" (UID: "da8b7841-ebf3-4483-bda6-39e4f1627697"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:14:12.881074 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.881048 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8b7841-ebf3-4483-bda6-39e4f1627697-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da8b7841-ebf3-4483-bda6-39e4f1627697" (UID: "da8b7841-ebf3-4483-bda6-39e4f1627697"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:14:12.881147 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.881054 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8b7841-ebf3-4483-bda6-39e4f1627697-kube-api-access-r4h9j" (OuterVolumeSpecName: "kube-api-access-r4h9j") pod "da8b7841-ebf3-4483-bda6-39e4f1627697" (UID: "da8b7841-ebf3-4483-bda6-39e4f1627697"). InnerVolumeSpecName "kube-api-access-r4h9j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:14:12.980109 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.980077 2539 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da8b7841-ebf3-4483-bda6-39e4f1627697-kserve-provision-location\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:14:12.980109 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.980105 2539 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da8b7841-ebf3-4483-bda6-39e4f1627697-proxy-tls\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:14:12.980109 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.980118 2539 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da8b7841-ebf3-4483-bda6-39e4f1627697-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:14:12.980340 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:12.980128 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4h9j\" (UniqueName: \"kubernetes.io/projected/da8b7841-ebf3-4483-bda6-39e4f1627697-kube-api-access-r4h9j\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\""
Apr 28 20:14:13.280054 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.280021 2539 generic.go:358] "Generic (PLEG): container finished" podID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerID="d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d" exitCode=0
Apr 28 20:14:13.280225 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.280092 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"
Apr 28 20:14:13.280225 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.280107 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerDied","Data":"d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d"}
Apr 28 20:14:13.280225 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.280145 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk" event={"ID":"da8b7841-ebf3-4483-bda6-39e4f1627697","Type":"ContainerDied","Data":"f83983d2d5c20894702f90c970196569d7c936b268a28941b2b0baf5f51010d0"}
Apr 28 20:14:13.280225 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.280160 2539 scope.go:117] "RemoveContainer" containerID="cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917"
Apr 28 20:14:13.288466 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.288444 2539 scope.go:117] "RemoveContainer" containerID="d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d"
Apr 28 20:14:13.295872 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.295848 2539 scope.go:117] "RemoveContainer" containerID="f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94"
Apr 28 20:14:13.303011 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.302989 2539 scope.go:117] "RemoveContainer" containerID="cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917"
Apr 28 20:14:13.303293 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:14:13.303269 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917\": container with ID starting with cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917 not found: ID does not exist" containerID="cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917"
Apr 28 20:14:13.303368 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.303305 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917"} err="failed to get container status \"cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917\": rpc error: code = NotFound desc = could not find container \"cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917\": container with ID starting with cf480c34c0945866cc171752fdc0d86f0f690d4e6d69f6c626560e3a6f9ef917 not found: ID does not exist"
Apr 28 20:14:13.303368 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.303326 2539 scope.go:117] "RemoveContainer" containerID="d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d"
Apr 28 20:14:13.303723 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:14:13.303705 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d\": container with ID starting with d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d not found: ID does not exist" containerID="d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d"
Apr 28 20:14:13.303826 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.303728 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d"} err="failed to get container status \"d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d\": rpc error: code = NotFound desc = could not find container \"d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d\": container with ID starting with d58e8d7029511c07829d4e5dd81defdf36c481206f79405f932ff3cc1c25142d not found: ID does not exist"
Apr 28 20:14:13.303826 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.303743 2539 scope.go:117] "RemoveContainer" containerID="f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94"
Apr 28 20:14:13.303995 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:14:13.303979 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94\": container with ID starting with f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94 not found: ID does not exist" containerID="f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94"
Apr 28 20:14:13.304055 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.304004 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94"} err="failed to get container status \"f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94\": rpc error: code = NotFound desc = could not find container \"f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94\": container with ID starting with f037fa95887207c2b28eb80d79637f8e50c31b5cd5d9ef4d78f8fa980838bc94 not found: ID does not exist"
Apr 28 20:14:13.304055 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.303979 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"]
Apr 28 20:14:13.307523 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:13.307497 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-k9kkk"]
Apr 28 20:14:14.271246 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:14:14.271212 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" path="/var/lib/kubelet/pods/da8b7841-ebf3-4483-bda6-39e4f1627697/volumes"
Apr 28 20:16:55.547553 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:16:55.547519 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:16:55.550066 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:16:55.550044 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:19:46.688130 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688096 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sncjd/must-gather-qhjwd"]
Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688535 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container"
Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688554 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container"
Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688569 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="storage-initializer"
Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688578 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="storage-initializer"
Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688595 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kube-rbac-proxy"
Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688604 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697"
containerName="kube-rbac-proxy" Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688663 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kube-rbac-proxy" Apr 28 20:19:46.688766 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.688678 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="da8b7841-ebf3-4483-bda6-39e4f1627697" containerName="kserve-container" Apr 28 20:19:46.691900 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.691882 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:46.693956 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.693937 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sncjd\"/\"kube-root-ca.crt\"" Apr 28 20:19:46.694051 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.694028 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sncjd\"/\"openshift-service-ca.crt\"" Apr 28 20:19:46.700847 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.700829 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sncjd/must-gather-qhjwd"] Apr 28 20:19:46.753067 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.753041 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs9cv\" (UniqueName: \"kubernetes.io/projected/07341600-724f-499f-8dc7-6000191e7d13-kube-api-access-gs9cv\") pod \"must-gather-qhjwd\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:46.753204 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.753087 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/07341600-724f-499f-8dc7-6000191e7d13-must-gather-output\") pod \"must-gather-qhjwd\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:46.853712 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.853679 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9cv\" (UniqueName: \"kubernetes.io/projected/07341600-724f-499f-8dc7-6000191e7d13-kube-api-access-gs9cv\") pod \"must-gather-qhjwd\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:46.853898 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.853732 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07341600-724f-499f-8dc7-6000191e7d13-must-gather-output\") pod \"must-gather-qhjwd\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:46.854109 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.854092 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07341600-724f-499f-8dc7-6000191e7d13-must-gather-output\") pod \"must-gather-qhjwd\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:46.861151 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:46.861118 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs9cv\" (UniqueName: \"kubernetes.io/projected/07341600-724f-499f-8dc7-6000191e7d13-kube-api-access-gs9cv\") pod \"must-gather-qhjwd\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:47.013273 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:47.013249 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:19:47.125890 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:47.125861 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sncjd/must-gather-qhjwd"] Apr 28 20:19:47.128842 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:19:47.128811 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07341600_724f_499f_8dc7_6000191e7d13.slice/crio-330a624ccb2d31f58370aa7e3994311255151a59ca3c6c73ce393733ae320a0e WatchSource:0}: Error finding container 330a624ccb2d31f58370aa7e3994311255151a59ca3c6c73ce393733ae320a0e: Status 404 returned error can't find the container with id 330a624ccb2d31f58370aa7e3994311255151a59ca3c6c73ce393733ae320a0e Apr 28 20:19:47.130275 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:47.130261 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:19:47.174751 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:47.174728 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sncjd/must-gather-qhjwd" event={"ID":"07341600-724f-499f-8dc7-6000191e7d13","Type":"ContainerStarted","Data":"330a624ccb2d31f58370aa7e3994311255151a59ca3c6c73ce393733ae320a0e"} Apr 28 20:19:52.191250 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:52.191057 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sncjd/must-gather-qhjwd" event={"ID":"07341600-724f-499f-8dc7-6000191e7d13","Type":"ContainerStarted","Data":"9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2"} Apr 28 20:19:53.195462 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:53.195425 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sncjd/must-gather-qhjwd" 
event={"ID":"07341600-724f-499f-8dc7-6000191e7d13","Type":"ContainerStarted","Data":"c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8"} Apr 28 20:19:53.212663 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:19:53.212622 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sncjd/must-gather-qhjwd" podStartSLOduration=2.291640652 podStartE2EDuration="7.212608482s" podCreationTimestamp="2026-04-28 20:19:46 +0000 UTC" firstStartedPulling="2026-04-28 20:19:47.130419969 +0000 UTC m=+3813.426775310" lastFinishedPulling="2026-04-28 20:19:52.051387797 +0000 UTC m=+3818.347743140" observedRunningTime="2026-04-28 20:19:53.211496299 +0000 UTC m=+3819.507851661" watchObservedRunningTime="2026-04-28 20:19:53.212608482 +0000 UTC m=+3819.508963872" Apr 28 20:20:12.249666 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:12.249633 2539 generic.go:358] "Generic (PLEG): container finished" podID="07341600-724f-499f-8dc7-6000191e7d13" containerID="9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2" exitCode=0 Apr 28 20:20:12.250018 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:12.249680 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sncjd/must-gather-qhjwd" event={"ID":"07341600-724f-499f-8dc7-6000191e7d13","Type":"ContainerDied","Data":"9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2"} Apr 28 20:20:12.250018 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:12.249998 2539 scope.go:117] "RemoveContainer" containerID="9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2" Apr 28 20:20:13.206208 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:13.206155 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sncjd_must-gather-qhjwd_07341600-724f-499f-8dc7-6000191e7d13/gather/0.log" Apr 28 20:20:16.520779 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:16.520747 2539 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-b4vxf_5325db29-356b-4407-92e1-5ad3950aa605/global-pull-secret-syncer/0.log" Apr 28 20:20:16.772024 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:16.771950 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c9r6q_006eba5f-c69e-415e-b993-2a2c72ae4df3/konnectivity-agent/0.log" Apr 28 20:20:16.840055 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:16.840027 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-206.ec2.internal_6d10770512520c6fc076a2363490adc7/haproxy/0.log" Apr 28 20:20:18.660145 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:18.660111 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sncjd/must-gather-qhjwd"] Apr 28 20:20:18.660560 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:18.660317 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-sncjd/must-gather-qhjwd" podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="copy" containerID="cri-o://c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8" gracePeriod=2 Apr 28 20:20:18.664669 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:18.664138 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sncjd/must-gather-qhjwd"] Apr 28 20:20:18.876817 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:18.876793 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sncjd_must-gather-qhjwd_07341600-724f-499f-8dc7-6000191e7d13/copy/0.log" Apr 28 20:20:18.877162 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:18.877147 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:20:18.878641 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:18.878620 2539 status_manager.go:895] "Failed to get status for pod" podUID="07341600-724f-499f-8dc7-6000191e7d13" pod="openshift-must-gather-sncjd/must-gather-qhjwd" err="pods \"must-gather-qhjwd\" is forbidden: User \"system:node:ip-10-0-143-206.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sncjd\": no relationship found between node 'ip-10-0-143-206.ec2.internal' and this object" Apr 28 20:20:19.032447 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.032422 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs9cv\" (UniqueName: \"kubernetes.io/projected/07341600-724f-499f-8dc7-6000191e7d13-kube-api-access-gs9cv\") pod \"07341600-724f-499f-8dc7-6000191e7d13\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " Apr 28 20:20:19.032610 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.032493 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07341600-724f-499f-8dc7-6000191e7d13-must-gather-output\") pod \"07341600-724f-499f-8dc7-6000191e7d13\" (UID: \"07341600-724f-499f-8dc7-6000191e7d13\") " Apr 28 20:20:19.033800 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.033772 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07341600-724f-499f-8dc7-6000191e7d13-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "07341600-724f-499f-8dc7-6000191e7d13" (UID: "07341600-724f-499f-8dc7-6000191e7d13"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:20:19.034539 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.034519 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07341600-724f-499f-8dc7-6000191e7d13-kube-api-access-gs9cv" (OuterVolumeSpecName: "kube-api-access-gs9cv") pod "07341600-724f-499f-8dc7-6000191e7d13" (UID: "07341600-724f-499f-8dc7-6000191e7d13"). InnerVolumeSpecName "kube-api-access-gs9cv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:20:19.133918 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.133897 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gs9cv\" (UniqueName: \"kubernetes.io/projected/07341600-724f-499f-8dc7-6000191e7d13-kube-api-access-gs9cv\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:20:19.133918 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.133917 2539 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07341600-724f-499f-8dc7-6000191e7d13-must-gather-output\") on node \"ip-10-0-143-206.ec2.internal\" DevicePath \"\"" Apr 28 20:20:19.270587 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.270563 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sncjd_must-gather-qhjwd_07341600-724f-499f-8dc7-6000191e7d13/copy/0.log" Apr 28 20:20:19.270964 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.270940 2539 generic.go:358] "Generic (PLEG): container finished" podID="07341600-724f-499f-8dc7-6000191e7d13" containerID="c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8" exitCode=143 Apr 28 20:20:19.271071 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.271002 2539 scope.go:117] "RemoveContainer" containerID="c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8" Apr 28 20:20:19.271071 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.271006 2539 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sncjd/must-gather-qhjwd" Apr 28 20:20:19.272833 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.272793 2539 status_manager.go:895] "Failed to get status for pod" podUID="07341600-724f-499f-8dc7-6000191e7d13" pod="openshift-must-gather-sncjd/must-gather-qhjwd" err="pods \"must-gather-qhjwd\" is forbidden: User \"system:node:ip-10-0-143-206.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sncjd\": no relationship found between node 'ip-10-0-143-206.ec2.internal' and this object" Apr 28 20:20:19.278884 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.278867 2539 scope.go:117] "RemoveContainer" containerID="9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2" Apr 28 20:20:19.280563 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.280535 2539 status_manager.go:895] "Failed to get status for pod" podUID="07341600-724f-499f-8dc7-6000191e7d13" pod="openshift-must-gather-sncjd/must-gather-qhjwd" err="pods \"must-gather-qhjwd\" is forbidden: User \"system:node:ip-10-0-143-206.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sncjd\": no relationship found between node 'ip-10-0-143-206.ec2.internal' and this object" Apr 28 20:20:19.291005 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.290989 2539 scope.go:117] "RemoveContainer" containerID="c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8" Apr 28 20:20:19.291257 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:20:19.291236 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8\": container with ID starting with c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8 not found: ID does not exist" 
containerID="c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8" Apr 28 20:20:19.291303 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.291265 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8"} err="failed to get container status \"c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8\": rpc error: code = NotFound desc = could not find container \"c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8\": container with ID starting with c6dbd138d091a5c56eb93af3c8cfe68d64014502e7139d8286832e613ccf73a8 not found: ID does not exist" Apr 28 20:20:19.291303 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.291282 2539 scope.go:117] "RemoveContainer" containerID="9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2" Apr 28 20:20:19.291539 ip-10-0-143-206 kubenswrapper[2539]: E0428 20:20:19.291521 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2\": container with ID starting with 9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2 not found: ID does not exist" containerID="9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2" Apr 28 20:20:19.291584 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:19.291544 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2"} err="failed to get container status \"9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2\": rpc error: code = NotFound desc = could not find container \"9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2\": container with ID starting with 9ba22d6691c4140c0de3f29933b3222008898ec12816934672b44853937769b2 not found: ID does not exist" Apr 28 
20:20:20.185211 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.185183 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-f68884b84-56nbs_ae1d9a61-7e7d-4030-a91a-57583b894f03/metrics-server/0.log" Apr 28 20:20:20.246553 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.246525 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxng4_6af9d93a-8042-4a3f-a6d6-b7603c690151/node-exporter/0.log" Apr 28 20:20:20.271066 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.271037 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07341600-724f-499f-8dc7-6000191e7d13" path="/var/lib/kubelet/pods/07341600-724f-499f-8dc7-6000191e7d13/volumes" Apr 28 20:20:20.272162 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.272137 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxng4_6af9d93a-8042-4a3f-a6d6-b7603c690151/kube-rbac-proxy/0.log" Apr 28 20:20:20.298298 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.298279 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxng4_6af9d93a-8042-4a3f-a6d6-b7603c690151/init-textfile/0.log" Apr 28 20:20:20.575732 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.575701 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/prometheus/0.log" Apr 28 20:20:20.592292 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.592272 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/config-reloader/0.log" Apr 28 20:20:20.612527 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.612504 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/thanos-sidecar/0.log" Apr 28 
20:20:20.633553 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.633532 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/kube-rbac-proxy-web/0.log" Apr 28 20:20:20.653995 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.653975 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/kube-rbac-proxy/0.log" Apr 28 20:20:20.674165 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.674146 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/kube-rbac-proxy-thanos/0.log" Apr 28 20:20:20.698471 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:20.698445 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b64d7f5a-30af-486f-af17-7e12f3783d7d/init-config-reloader/0.log" Apr 28 20:20:23.564369 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564339 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"] Apr 28 20:20:23.564747 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564621 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="gather" Apr 28 20:20:23.564747 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564632 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="gather" Apr 28 20:20:23.564747 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564658 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="copy" Apr 28 20:20:23.564747 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564664 2539 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="copy" Apr 28 20:20:23.564747 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564702 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="copy" Apr 28 20:20:23.564747 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.564709 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="07341600-724f-499f-8dc7-6000191e7d13" containerName="gather" Apr 28 20:20:23.566541 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.566526 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" Apr 28 20:20:23.568729 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.568706 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r9lxr\"/\"kube-root-ca.crt\"" Apr 28 20:20:23.568847 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.568707 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r9lxr\"/\"openshift-service-ca.crt\"" Apr 28 20:20:23.569731 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.569716 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r9lxr\"/\"default-dockercfg-7h6qj\"" Apr 28 20:20:23.578034 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.578014 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"] Apr 28 20:20:23.672602 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.672564 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swxh6\" (UniqueName: \"kubernetes.io/projected/b6d02c06-fbd0-457e-9e20-017cff439fee-kube-api-access-swxh6\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " 
pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" Apr 28 20:20:23.672602 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.672602 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-sys\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" Apr 28 20:20:23.672806 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.672635 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-proc\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" Apr 28 20:20:23.672806 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.672653 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-lib-modules\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" Apr 28 20:20:23.672806 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.672667 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-podres\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" Apr 28 20:20:23.774101 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774067 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-proc\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774101 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774102 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-lib-modules\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774117 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-podres\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774169 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swxh6\" (UniqueName: \"kubernetes.io/projected/b6d02c06-fbd0-457e-9e20-017cff439fee-kube-api-access-swxh6\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774183 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-proc\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774195 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-sys\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774243 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-sys\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774294 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774252 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-lib-modules\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.774520 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.774297 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b6d02c06-fbd0-457e-9e20-017cff439fee-podres\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.781513 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.781493 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swxh6\" (UniqueName: \"kubernetes.io/projected/b6d02c06-fbd0-457e-9e20-017cff439fee-kube-api-access-swxh6\") pod \"perf-node-gather-daemonset-hhlgn\" (UID: \"b6d02c06-fbd0-457e-9e20-017cff439fee\") " pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.875928 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.875839 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:23.989365 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:23.989336 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"]
Apr 28 20:20:23.991673 ip-10-0-143-206 kubenswrapper[2539]: W0428 20:20:23.991647 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb6d02c06_fbd0_457e_9e20_017cff439fee.slice/crio-c3b7a31f7dd395502a16ae43c9450c3e885ece165aa1e79f33497c476bf3e4fc WatchSource:0}: Error finding container c3b7a31f7dd395502a16ae43c9450c3e885ece165aa1e79f33497c476bf3e4fc: Status 404 returned error can't find the container with id c3b7a31f7dd395502a16ae43c9450c3e885ece165aa1e79f33497c476bf3e4fc
Apr 28 20:20:24.216429 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.216356 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d2nhx_f8173256-d810-4483-b373-4b19f554cbf6/dns/0.log"
Apr 28 20:20:24.238463 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.238443 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d2nhx_f8173256-d810-4483-b373-4b19f554cbf6/kube-rbac-proxy/0.log"
Apr 28 20:20:24.285414 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.285347 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" event={"ID":"b6d02c06-fbd0-457e-9e20-017cff439fee","Type":"ContainerStarted","Data":"790b7694ab0801bccee4efef0abbe194962b4f12d07c4a5ab60f67f0810ffea5"}
Apr 28 20:20:24.285414 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.285412 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" event={"ID":"b6d02c06-fbd0-457e-9e20-017cff439fee","Type":"ContainerStarted","Data":"c3b7a31f7dd395502a16ae43c9450c3e885ece165aa1e79f33497c476bf3e4fc"}
Apr 28 20:20:24.285596 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.285534 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:24.301880 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.301840 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn" podStartSLOduration=1.301828305 podStartE2EDuration="1.301828305s" podCreationTimestamp="2026-04-28 20:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:20:24.2998719 +0000 UTC m=+3850.596227262" watchObservedRunningTime="2026-04-28 20:20:24.301828305 +0000 UTC m=+3850.598183668"
Apr 28 20:20:24.311365 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.311342 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4kjrn_5217b4ac-ec08-4f15-af88-99f26535e549/dns-node-resolver/0.log"
Apr 28 20:20:24.772149 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.772122 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-85f54f8846-b82cj_44e01525-a6e2-451a-8b52-51306d0ab16f/registry/0.log"
Apr 28 20:20:24.792790 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:24.792767 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g4cj8_0daae916-5659-44ea-96b4-ed96cbfa9da3/node-ca/0.log"
Apr 28 20:20:25.878365 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:25.878337 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hl4tm_f2e61248-2e6c-4f91-806d-ba6a148c3b71/serve-healthcheck-canary/0.log"
Apr 28 20:20:26.266386 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:26.266359 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78qnv_5c7a7b2d-077e-4f5f-96e5-571525a4f600/kube-rbac-proxy/0.log"
Apr 28 20:20:26.285517 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:26.285494 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78qnv_5c7a7b2d-077e-4f5f-96e5-571525a4f600/exporter/0.log"
Apr 28 20:20:26.307284 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:26.307260 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78qnv_5c7a7b2d-077e-4f5f-96e5-571525a4f600/extractor/0.log"
Apr 28 20:20:29.652958 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:29.652909 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b85c69797-jx7z7_94647eb0-60e0-4a5e-a906-7035ffdc4738/manager/0.log"
Apr 28 20:20:30.032798 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:30.032755 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-2hwb9_1bf14349-ef4e-45fd-bb5d-ce0795206852/manager/0.log"
Apr 28 20:20:30.298228 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:30.298156 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r9lxr/perf-node-gather-daemonset-hhlgn"
Apr 28 20:20:30.537559 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:30.537499 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-5t9jq_71704a35-fdd3-4b64-a758-2b24c286270e/s3-tls-init-serving/0.log"
Apr 28 20:20:30.588388 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:30.588305 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-6tn7v_0fe3fc8e-4148-4958-9bc0-c0b1fb3fa5fe/seaweedfs-tls-custom/0.log"
Apr 28 20:20:30.611857 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:30.611829 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-fk69z_d08931bf-78f5-41dc-906f-62525abfa8ce/seaweedfs-tls-serving/0.log"
Apr 28 20:20:36.042442 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.042411 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/kube-multus-additional-cni-plugins/0.log"
Apr 28 20:20:36.061919 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.061892 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/egress-router-binary-copy/0.log"
Apr 28 20:20:36.081037 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.081009 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/cni-plugins/0.log"
Apr 28 20:20:36.104122 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.104098 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/bond-cni-plugin/0.log"
Apr 28 20:20:36.124544 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.124524 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/routeoverride-cni/0.log"
Apr 28 20:20:36.148700 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.148674 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/whereabouts-cni-bincopy/0.log"
Apr 28 20:20:36.212247 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.212222 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j682m_58213e63-9543-4438-bbbf-d242d52abc8f/whereabouts-cni/0.log"
Apr 28 20:20:36.505661 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.505635 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nlr5m_7e8df34b-a216-4c08-a88b-4c94b5d16b1c/kube-multus/0.log"
Apr 28 20:20:36.615349 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.615317 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-txdd9_2c344b2c-cf71-45b1-9143-e86be8d1b7b5/network-metrics-daemon/0.log"
Apr 28 20:20:36.639706 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:36.639682 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-txdd9_2c344b2c-cf71-45b1-9143-e86be8d1b7b5/kube-rbac-proxy/0.log"
Apr 28 20:20:37.453242 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.453214 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-controller/0.log"
Apr 28 20:20:37.473136 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.473114 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/0.log"
Apr 28 20:20:37.491877 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.491854 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovn-acl-logging/1.log"
Apr 28 20:20:37.508009 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.507984 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/kube-rbac-proxy-node/0.log"
Apr 28 20:20:37.530966 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.530928 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 20:20:37.551468 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.551448 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/northd/0.log"
Apr 28 20:20:37.578080 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.578057 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/nbdb/0.log"
Apr 28 20:20:37.600182 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.600168 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/sbdb/0.log"
Apr 28 20:20:37.713145 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:37.713072 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-977nw_5ced465f-4a51-4441-b363-efac6c32deb0/ovnkube-controller/0.log"
Apr 28 20:20:39.352833 ip-10-0-143-206 kubenswrapper[2539]: I0428 20:20:39.352802 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wg74q_dff9f9ea-63cc-4089-bb7e-e9fcb292c695/network-check-target-container/0.log"