Apr 24 19:03:43.904635 ip-10-0-138-6 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 19:03:43.904651 ip-10-0-138-6 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 19:03:43.904661 ip-10-0-138-6 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 19:03:43.904996 ip-10-0-138-6 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 19:03:54.080746 ip-10-0-138-6 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 19:03:54.080763 ip-10-0-138-6 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 24c87026b1804ff79fa532841524faa1 --
Apr 24 19:06:02.767713 ip-10-0-138-6 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:06:03.197546 ip-10-0-138-6 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:03.197546 ip-10-0-138-6 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:06:03.197546 ip-10-0-138-6 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:03.197546 ip-10-0-138-6 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:06:03.197546 ip-10-0-138-6 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:03.199006 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.198915    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:06:03.201222 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201208    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201224    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201228    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201231    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201234    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201236    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201239    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201242    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201245    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201247    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201249    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201252    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201256    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201258    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:03.201257 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201262    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201265    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201268    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201270    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201273    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201275    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201278    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201288    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201293    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201296    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201299    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201301    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201303    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201306    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201310    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201312    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201315    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201317    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201319    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201322    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:03.201618 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201324    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201327    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201330    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201334    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201336    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201338    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201341    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201343    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201346    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201348    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201351    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201353    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201356    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201358    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201361    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201364    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201366    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201369    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201373    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201377    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:03.202110 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201380    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201383    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201385    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201388    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201390    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201393    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201395    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201397    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201400    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201402    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201405    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201408    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201410    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201413    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201416    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201419    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201422    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201424    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201426    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201442    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:03.202631 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201446    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201450    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201453    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201455    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201458    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201460    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201463    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201466    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201468    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201471    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201473    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201476    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201864    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201869    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201872    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201874    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201877    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201880    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201883    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201885    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:03.203103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201888    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201891    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201894    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201897    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201900    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201902    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201905    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201913    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201916    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201919    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201921    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201924    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201927    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201929    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201932    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201934    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201937    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201939    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201942    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201944    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:03.203643 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201946    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201949    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201951    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201954    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201956    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201959    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201961    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201964    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201966    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201968    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201971    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201974    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201976    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201980    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201982    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201985    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201987    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201990    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201993    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.201995    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:03.204171 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202003    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202005    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202008    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202010    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202014    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202016    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202019    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202021    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202023    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202026    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202028    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202030    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202033    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202035    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202038    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202040    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202042    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202046    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202050    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202052    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:03.204676 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202055    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202059    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202063    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202067    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202069    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202072    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202075    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202078    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202080    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202083    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202085    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202088    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202090    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202098    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202106    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202110    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202112    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.202115    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203359    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:06:03.205163 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203367    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203376    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203380    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203384    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203387    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203391    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203395    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203399    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203402    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203405    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203408    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203411    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203414    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203417    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203420    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203423    2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203426    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203442    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203450    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203453    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203456    2573 flags.go:64] FLAG: --config-dir=""
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203459    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203462    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203466    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:06:03.205658 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203469    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203472    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203475    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203484    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203487    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203490    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203493    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203496    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203500    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203503    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203506    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203509    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203512    2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203515    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203521    2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203524    2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203527    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203530    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203533    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203536    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203539    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203542    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203545    2573 flags.go:64] FLAG:
--eviction-soft="" Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203548 2573 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203551 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 19:06:03.206247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203554 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203557 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203560 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203563 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203566 2573 flags.go:64] FLAG: --feature-gates="" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203570 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203572 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203575 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203579 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203582 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203585 2573 flags.go:64] FLAG: --help="false" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203593 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203596 2573 
flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203599 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203602 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203606 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203609 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203612 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203614 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203618 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203621 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203623 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203626 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203629 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:06:03.206863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203632 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203634 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:06:03.207457 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:03.203637 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203640 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203646 2573 flags.go:64] FLAG: --lock-file="" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203649 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203652 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203655 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203660 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203664 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203666 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203669 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203672 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203675 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203678 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203681 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203685 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 
19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203688 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203692 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203695 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203704 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203707 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203711 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203713 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203716 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:06:03.207457 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203719 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203726 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203729 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203732 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203735 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203738 2573 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203743 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203746 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203749 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203752 2573 flags.go:64] FLAG: --port="10250" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203755 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203757 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0db16ddf2bd7822ce" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203762 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203765 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203767 2573 flags.go:64] FLAG: --register-node="true" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203771 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203774 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203777 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203780 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203783 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 19:06:03.208079 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:03.203786 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203789 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203792 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203795 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203798 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:06:03.208079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203800 2573 flags.go:64] FLAG: --runonce="false" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203803 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203806 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203815 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203818 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203821 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203823 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203827 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203829 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203832 2573 flags.go:64] FLAG: 
--storage-driver-secure="false" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203835 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203838 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203841 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203843 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203846 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203849 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203854 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203857 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203859 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203867 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203870 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203873 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203876 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203879 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:06:03.203882 2573 flags.go:64] FLAG: --v="2" Apr 24 19:06:03.208700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203886 2573 flags.go:64] FLAG: --version="false" Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203890 2573 flags.go:64] FLAG: --vmodule="" Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203894 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.203898 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204020 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204024 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204027 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204030 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204033 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204036 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204039 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204047 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204050 2573 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204053 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204055 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204058 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204060 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204063 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204065 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204067 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:06:03.209300 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204070 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204072 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204075 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204077 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204079 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204081 2573 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204085 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204088 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204090 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204093 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204095 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204098 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204101 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204103 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204106 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204108 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204111 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204114 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: 
W0424 19:06:03.204116 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204119 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 19:06:03.209878 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204121 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204124 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204127 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204131 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204139 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204142 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204144 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204147 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204149 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204152 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204154 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:06:03.210410 
ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204157 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204159 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204162 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204164 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204167 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204169 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204171 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204175 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204178 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:06:03.210410 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204180 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204184 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204192 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204195 2573 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204197 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204199 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204202 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204204 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204207 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204209 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204211 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204214 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204216 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204219 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204222 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204224 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204226 
2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204234 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204237 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204239 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:06:03.211213 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204241 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204244 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204246 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204249 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204251 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204257 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204260 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204262 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204265 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.204267 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:03.211898 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.204277 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.212077 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.212093 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212142 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212147 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212150 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212153 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212155 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212158 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:03.212157 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212161 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212164 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212167 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212169 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212172 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212174 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212177 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212179 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212182 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212185 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212187 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212189 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212192 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212196 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212200 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212203 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212206 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212209 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212211 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:03.212398 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212214 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212217 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212219 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212222 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212225 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212228 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212232 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212235 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212237 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212240 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212242 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212244 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212247 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212249 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212252 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212255 2573 feature_gate.go:328]
unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212258 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212260 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212263 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:03.212881 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212266 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212268 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212271 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212273 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212276 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212278 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212281 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212283 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212286 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212289 2573 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212291 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212294 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212296 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212299 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212301 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212304 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212307 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212309 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212311 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212314 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:03.213370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212316 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212319 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:03.213929 ip-10-0-138-6 
kubenswrapper[2573]: W0424 19:06:03.212322 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212324 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212327 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212329 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212331 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212334 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212337 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212340 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212342 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212345 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212348 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212351 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212353 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:06:03.213929 
ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212355 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212358 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212360 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212363 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212365 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:03.213929 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212368 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212370 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.212375 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212478 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212483 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: 
W0424 19:06:03.212486 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212488 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212491 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212494 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212496 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212499 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212502 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212505 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212507 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212510 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212512 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:06:03.214416 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212515 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212517 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:06:03.214814 
ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212519 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212522 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212525 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212527 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212530 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212532 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212535 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212538 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212540 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212543 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212545 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212547 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212550 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:06:03.214814 ip-10-0-138-6 
kubenswrapper[2573]: W0424 19:06:03.212552 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212555 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212557 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212559 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212562 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:03.214814 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212564 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212568 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212572 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212575 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212578 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212580 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212583 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212586 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212588 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212590 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212593 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212596 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212598 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212600 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212603 2573 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212606 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212608 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212611 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212613 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 19:06:03.215305 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212616 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212618 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212621 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212624 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212626 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212628 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212631 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212633 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 19:06:03.215856 ip-10-0-138-6 
kubenswrapper[2573]: W0424 19:06:03.212635 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212638 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212640 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212642 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212645 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212647 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212649 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212653 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212656 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212659 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212661 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 19:06:03.215856 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212664 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212666 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212669 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212671 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212673 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212676 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212678 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212681 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212683 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212686 2573 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212688 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212691 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212693 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212695 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:03.212698 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:03.216319 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.212703 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 19:06:03.216758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.213298 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 19:06:03.216758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.216108 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 19:06:03.217029 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.217017 2573 server.go:1019] "Starting client certificate rotation" Apr 24 19:06:03.217135 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:03.217117 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 19:06:03.217173 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.217159 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 19:06:03.240527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.240504 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 19:06:03.243046 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.243012 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 19:06:03.257409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.257392 2573 log.go:25] "Validated CRI v1 runtime API" Apr 24 19:06:03.264218 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.264204 2573 log.go:25] "Validated CRI v1 image API" Apr 24 19:06:03.266660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.266644 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 19:06:03.269746 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.269726 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 19:06:03.270816 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.270786 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7a26240a-aed9-4392-9b5a-6ad69477ad18:/dev/nvme0n1p3 ca74c020-4fe8-430d-88f9-219170adc9a3:/dev/nvme0n1p4] Apr 24 19:06:03.270891 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.270815 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} 
/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 19:06:03.277443 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.277314 2573 manager.go:217] Machine: {Timestamp:2026-04-24 19:06:03.276160254 +0000 UTC m=+0.393500633 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101983 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26d04e299797970d7047abbc72de42 SystemUUID:ec26d04e-2997-9797-0d70-47abbc72de42 BootID:24c87026-b180-4ff7-9fa5-32841524faa1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:72:6d:95:83:73 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:72:6d:95:83:73 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:75:84:e5:3e:fb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 
Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 19:06:03.277443 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.277416 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 19:06:03.277570 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.277546 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 19:06:03.277915 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.277895 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 19:06:03.278048 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.277917 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-6.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"contain
er","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 19:06:03.278090 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.278058 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 19:06:03.278090 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.278066 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 19:06:03.278090 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.278079 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:06:03.278924 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.278914 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:06:03.280011 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.280001 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:03.280118 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.280110 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 19:06:03.282480 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.282469 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 24 19:06:03.282527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.282484 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 19:06:03.282527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.282495 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 19:06:03.282527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.282504 2573 kubelet.go:397] "Adding apiserver pod source" Apr 24 19:06:03.282527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.282512 2573 apiserver.go:42] "Waiting for node sync before watching apiserver 
pods" Apr 24 19:06:03.283577 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.283562 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:06:03.283620 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.283590 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:06:03.287242 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.287222 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 19:06:03.288516 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.288500 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 19:06:03.289948 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289935 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 19:06:03.289990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289952 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 19:06:03.289990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289960 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 19:06:03.289990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289965 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 19:06:03.289990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289972 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 19:06:03.289990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289978 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 19:06:03.289990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289983 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 19:06:03.289990 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:03.289988 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 19:06:03.290176 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.289995 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 19:06:03.290176 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.290002 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 19:06:03.290176 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.290016 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 19:06:03.290176 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.290025 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 19:06:03.290756 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.290746 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 19:06:03.290809 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.290759 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 19:06:03.293445 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.293411 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-6.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 19:06:03.293840 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.293811 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-6.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 19:06:03.293891 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.293811 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 19:06:03.294306 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.294289 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qdmc8" Apr 24 19:06:03.294342 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.294312 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 19:06:03.294342 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.294337 2573 server.go:1295] "Started kubelet" Apr 24 19:06:03.294947 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.294903 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 19:06:03.295068 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.295043 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 19:06:03.295386 ip-10-0-138-6 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 19:06:03.295896 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.295573 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 19:06:03.296617 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.296602 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 19:06:03.298411 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.298397 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 24 19:06:03.300584 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.300565 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:03.301136 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.301116 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 19:06:03.301930 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.301903 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 19:06:03.301930 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.301914 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 19:06:03.302045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.301945 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 19:06:03.302045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.302015 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 24 19:06:03.302045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.302026 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 24 19:06:03.302169 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.302092 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.305494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.303534 2573 factory.go:55] Registering systemd factory Apr 24 19:06:03.305494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.303602 2573 factory.go:223] 
Registration of the systemd container factory successfully Apr 24 19:06:03.305494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.304997 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qdmc8" Apr 24 19:06:03.305494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.305261 2573 factory.go:153] Registering CRI-O factory Apr 24 19:06:03.305494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.305277 2573 factory.go:223] Registration of the crio container factory successfully Apr 24 19:06:03.305494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.305388 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 19:06:03.305886 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.305869 2573 factory.go:103] Registering Raw factory Apr 24 19:06:03.305958 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.305898 2573 manager.go:1196] Started watching for new ooms in manager Apr 24 19:06:03.306956 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.306799 2573 manager.go:319] Starting recovery of all containers Apr 24 19:06:03.307930 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.307799 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-6.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 19:06:03.307930 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.307873 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 19:06:03.308153 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.308133 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 19:06:03.311965 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.307960 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-6.ec2.internal.18a9607045944199 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-6.ec2.internal,UID:ip-10-0-138-6.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-6.ec2.internal,},FirstTimestamp:2026-04-24 19:06:03.294319001 +0000 UTC m=+0.411659379,LastTimestamp:2026-04-24 19:06:03.294319001 +0000 UTC m=+0.411659379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-6.ec2.internal,}" Apr 24 19:06:03.321268 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.321251 2573 manager.go:324] Recovery completed Apr 24 19:06:03.324961 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.324947 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:03.327066 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.327051 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:03.327125 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.327077 2573 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-138-6.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:03.327125 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.327087 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:03.327574 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.327561 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 19:06:03.327622 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.327573 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 19:06:03.327622 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.327593 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:03.329978 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.329964 2573 policy_none.go:49] "None policy: Start" Apr 24 19:06:03.329978 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.329980 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 19:06:03.330081 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.329999 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 24 19:06:03.367072 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.367052 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.367121 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.367132 2573 server.go:85] "Starting device plugin registration server" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.367358 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.367371 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 19:06:03.375113 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:03.367472 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.367647 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.367658 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.368300 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.368372 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.373600 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.374893 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.374912 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.374927 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.374932 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 19:06:03.375113 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.374960 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 19:06:03.378637 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.378623 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:03.468058 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.467991 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:03.468989 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.468970 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:03.469045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.469004 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:03.469045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.469025 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:03.469106 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.469055 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.475072 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.475053 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal"] Apr 24 19:06:03.475126 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.475118 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 
19:06:03.476480 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.476467 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:03.476547 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.476498 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:03.476547 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.476510 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:03.477573 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.477557 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:03.477718 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.477703 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.477792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.477738 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:03.477792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.477789 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.477896 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.477809 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-6.ec2.internal\": node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.478254 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.478235 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:03.478336 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.478233 2573 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:03.478336 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.478285 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:03.478336 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.478297 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:03.478336 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.478262 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:03.478483 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.478341 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:03.479329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.479316 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.479391 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.479342 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:03.480385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.480370 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:03.480479 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.480405 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:03.480479 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.480415 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:03.500986 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.500965 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-6.ec2.internal\" not found" node="ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.502973 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.502958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d8ea1cce20b14f0d5ffac11d636ad6aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal\" (UID: \"d8ea1cce20b14f0d5ffac11d636ad6aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.503044 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.502988 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8ea1cce20b14f0d5ffac11d636ad6aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal\" (UID: 
\"d8ea1cce20b14f0d5ffac11d636ad6aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.503044 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.503016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/946221ebdc75fe55155252abed2eec40-config\") pod \"kube-apiserver-proxy-ip-10-0-138-6.ec2.internal\" (UID: \"946221ebdc75fe55155252abed2eec40\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.504384 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.504370 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-6.ec2.internal\" not found" node="ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.506322 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.506307 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.603800 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.603772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d8ea1cce20b14f0d5ffac11d636ad6aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal\" (UID: \"d8ea1cce20b14f0d5ffac11d636ad6aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.603921 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.603808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8ea1cce20b14f0d5ffac11d636ad6aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal\" (UID: \"d8ea1cce20b14f0d5ffac11d636ad6aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.603921 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:03.603845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/946221ebdc75fe55155252abed2eec40-config\") pod \"kube-apiserver-proxy-ip-10-0-138-6.ec2.internal\" (UID: \"946221ebdc75fe55155252abed2eec40\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.603921 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.603877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/946221ebdc75fe55155252abed2eec40-config\") pod \"kube-apiserver-proxy-ip-10-0-138-6.ec2.internal\" (UID: \"946221ebdc75fe55155252abed2eec40\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.603921 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.603887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8ea1cce20b14f0d5ffac11d636ad6aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal\" (UID: \"d8ea1cce20b14f0d5ffac11d636ad6aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.603921 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.603877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d8ea1cce20b14f0d5ffac11d636ad6aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal\" (UID: \"d8ea1cce20b14f0d5ffac11d636ad6aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.606877 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.606861 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.707377 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.707348 2573 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.803003 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.802935 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.806551 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:03.806537 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" Apr 24 19:06:03.807623 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.807601 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:03.908335 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:03.908295 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.008791 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:04.008759 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.109300 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:04.109223 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.209698 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:04.209671 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.216829 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.216809 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 19:06:04.216992 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.216976 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:06:04.301115 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.301077 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:04.307937 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.307758 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:01:03 +0000 UTC" deadline="2028-01-12 19:20:18.063780146 +0000 UTC" Apr 24 19:06:04.307937 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.307791 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15072h14m13.75599262s" Apr 24 19:06:04.310502 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:04.310482 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.315333 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.315312 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 19:06:04.316687 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:04.316660 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod946221ebdc75fe55155252abed2eec40.slice/crio-f56f43257f668e4d97089d2016e378c93c86e58a0e8b70cc6395863b87e2e82b WatchSource:0}: Error finding container f56f43257f668e4d97089d2016e378c93c86e58a0e8b70cc6395863b87e2e82b: Status 404 returned error can't find the container with id f56f43257f668e4d97089d2016e378c93c86e58a0e8b70cc6395863b87e2e82b Apr 24 19:06:04.316926 ip-10-0-138-6 
kubenswrapper[2573]: W0424 19:06:04.316909 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ea1cce20b14f0d5ffac11d636ad6aa.slice/crio-d0cc6d3ddcba08f7208a0fc84719383f026f01eb00c72c70fa35fe16b7afb103 WatchSource:0}: Error finding container d0cc6d3ddcba08f7208a0fc84719383f026f01eb00c72c70fa35fe16b7afb103: Status 404 returned error can't find the container with id d0cc6d3ddcba08f7208a0fc84719383f026f01eb00c72c70fa35fe16b7afb103 Apr 24 19:06:04.320318 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.320302 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:06:04.334318 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.334297 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v9xr8" Apr 24 19:06:04.340542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.340525 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v9xr8" Apr 24 19:06:04.377966 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.377877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" event={"ID":"946221ebdc75fe55155252abed2eec40","Type":"ContainerStarted","Data":"f56f43257f668e4d97089d2016e378c93c86e58a0e8b70cc6395863b87e2e82b"} Apr 24 19:06:04.378807 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.378786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" event={"ID":"d8ea1cce20b14f0d5ffac11d636ad6aa","Type":"ContainerStarted","Data":"d0cc6d3ddcba08f7208a0fc84719383f026f01eb00c72c70fa35fe16b7afb103"} Apr 24 19:06:04.411242 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:04.411221 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.511719 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:04.511678 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-6.ec2.internal\" not found" Apr 24 19:06:04.528406 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.528387 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:04.599108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.599083 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:04.601345 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.601332 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" Apr 24 19:06:04.617121 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.617104 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:04.618020 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.618008 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" Apr 24 19:06:04.637561 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.637515 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:04.648446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:04.648412 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:05.150908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.150876 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 
19:06:05.283498 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.283466 2573 apiserver.go:52] "Watching apiserver" Apr 24 19:06:05.289965 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.289939 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 19:06:05.291769 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.291717 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-nrb24","openshift-image-registry/node-ca-ntjd2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal","openshift-multus/multus-additional-cni-plugins-t4krc","openshift-multus/network-metrics-daemon-tf94j","openshift-network-operator/iptables-alerter-8qp2b","openshift-ovn-kubernetes/ovnkube-node-wt2vz","kube-system/konnectivity-agent-zz6t5","kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh","openshift-cluster-node-tuning-operator/tuned-6wv5x","openshift-multus/multus-lbrbg"] Apr 24 19:06:05.294838 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.294820 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:05.294924 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.294896 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:05.295880 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.295859 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.295983 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.295965 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.296990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.296967 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:05.297084 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.297044 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:05.298164 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.298143 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.298768 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.298863 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.298890 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299062 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299088 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299095 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-824w8\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299194 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7slfx\"" Apr 24 19:06:05.299329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 19:06:05.299775 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 19:06:05.299775 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.299775 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.299679 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 19:06:05.300698 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.300676 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dcm94\"" Apr 24 19:06:05.300698 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.300694 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:06:05.300698 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.300705 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:05.300973 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.300956 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:05.302016 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.301756 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:05.302016 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.301800 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:06:05.302016 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.301819 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:06:05.302205 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.302181 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:06:05.302205 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.302192 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:06:05.302322 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.302242 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jgnvn\"" Apr 24 19:06:05.302490 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.302474 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:06:05.302805 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.302788 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:06:05.303200 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.303181 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.305036 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.304870 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:06:05.305099 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.305090 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:06:05.305447 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.304705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.305771 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.305745 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-l7ql5\"" Apr 24 19:06:05.306233 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.306216 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:06:05.306341 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.306318 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:06:05.306418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.306381 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-msb7g\"" Apr 24 19:06:05.306534 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.306517 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:06:05.309124 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.308726 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:05.309124 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.308924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.309477 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.309407 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-76jbw\"" Apr 24 19:06:05.309560 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.309514 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:05.311128 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lfjkm\"" Apr 24 19:06:05.311385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311368 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:06:05.311510 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.311510 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zztmz\" (UniqueName: \"kubernetes.io/projected/b0872aa7-303f-4052-9d68-dd136609293b-kube-api-access-zztmz\") pod \"network-metrics-daemon-tf94j\" (UID: 
\"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:05.311621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-kubelet\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovn-node-metrics-cert\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.311621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-etc-selinux\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.311621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ca30f8c7-a373-4425-93f6-cfa4c4634150-agent-certs\") pod \"konnectivity-agent-zz6t5\" (UID: \"ca30f8c7-a373-4425-93f6-cfa4c4634150\") " pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:05.311621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-node-log\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-log-socket\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311660 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-run-ovn-kubernetes\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-sys-fs\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:06:05.311704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ca30f8c7-a373-4425-93f6-cfa4c4634150-konnectivity-ca\") pod \"konnectivity-agent-zz6t5\" (UID: \"ca30f8c7-a373-4425-93f6-cfa4c4634150\") " pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311762 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-var-lib-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zxht\" (UniqueName: \"kubernetes.io/projected/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-kube-api-access-8zxht\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-device-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" 
(UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-systemd\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.311911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-host\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-os-release\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.311996 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-run-netns\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312013 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-cni-netd\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovnkube-script-lib\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312047 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-system-cni-dir\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 
19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312080 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-ovn\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-socket-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpr7j\" (UniqueName: \"kubernetes.io/projected/98fd67e7-d173-4139-9cd2-a4b311033089-kube-api-access-qpr7j\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312159 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/0e855347-f4fa-493a-b42a-57f880bfc25d-iptables-alerter-script\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e855347-f4fa-493a-b42a-57f880bfc25d-host-slash\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-serviceca\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.312378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-env-overrides\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312243 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r5lz\" (UniqueName: \"kubernetes.io/projected/0e855347-f4fa-493a-b42a-57f880bfc25d-kube-api-access-2r5lz\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mn5\" (UniqueName: \"kubernetes.io/projected/6b161493-342d-489a-a1b5-2d34fb7236d6-kube-api-access-h6mn5\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-registration-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vbl\" (UniqueName: \"kubernetes.io/projected/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-kube-api-access-g6vbl\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312424 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-cnibin\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-systemd-units\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-slash\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312563 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-etc-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-cni-bin\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.313154 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.312631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovnkube-config\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.341271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.341233 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:04 +0000 UTC" deadline="2027-12-07 17:18:32.639511992 +0000 UTC"
Apr 24 19:06:05.341355 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.341272 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14206h12m27.298251224s"
Apr 24 19:06:05.402678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.402615 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 19:06:05.413453 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f546568-c9ad-4518-a7a9-893e659002a9-tmp\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x"
Apr 24 19:06:05.413581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-cnibin\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.413581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-slash\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.413581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.413581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413571 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-etc-selinux\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-cni-bin\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413619 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-daemon-config\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-cnibin\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-kubernetes\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-etc-selinux\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-kubelet\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-slash\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovn-node-metrics-cert\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.413791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413797 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-log-socket\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-run-ovn-kubernetes\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-conf-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-multus-certs\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ca30f8c7-a373-4425-93f6-cfa4c4634150-konnectivity-ca\") pod \"konnectivity-agent-zz6t5\" (UID: \"ca30f8c7-a373-4425-93f6-cfa4c4634150\") " pod="kube-system/konnectivity-agent-zz6t5"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-var-lib-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.413986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zxht\" (UniqueName: \"kubernetes.io/projected/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-kube-api-access-8zxht\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-device-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-kubelet\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8j65\" (UniqueName: \"kubernetes.io/projected/cfb7edc3-113a-4b12-83d1-66356304b80c-kube-api-access-j8j65\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-host\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-run-netns\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cfb7edc3-113a-4b12-83d1-66356304b80c-cni-binary-copy\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-systemd\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-sys\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x"
Apr 24 19:06:05.414257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-host\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-var-lib-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-system-cni-dir\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414303 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-socket-dir-parent\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414327 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-k8s-cni-cncf-io\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-env-overrides\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-cni-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-etc-kubernetes\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414478 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-run\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-run-netns\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghkg\" (UniqueName: \"kubernetes.io/projected/7f546568-c9ad-4518-a7a9-893e659002a9-kube-api-access-5ghkg\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414529 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-device-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-log-socket\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414702 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-kubelet\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-run-ovn-kubernetes\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-system-cni-dir\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.414550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ca30f8c7-a373-4425-93f6-cfa4c4634150-konnectivity-ca\") pod \"konnectivity-agent-zz6t5\" (UID: \"ca30f8c7-a373-4425-93f6-cfa4c4634150\") " pod="kube-system/konnectivity-agent-zz6t5"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mn5\" (UniqueName: \"kubernetes.io/projected/6b161493-342d-489a-a1b5-2d34fb7236d6-kube-api-access-h6mn5\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415114 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-env-overrides\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-os-release\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-cnibin\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vbl\" (UniqueName: \"kubernetes.io/projected/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-kube-api-access-g6vbl\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-systemd-units\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-etc-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-systemd-units\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.415767 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-etc-openvswitch\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.415599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-cni-bin\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-cni-bin\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovnkube-config\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zztmz\" (UniqueName: \"kubernetes.io/projected/b0872aa7-303f-4052-9d68-dd136609293b-kube-api-access-zztmz\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ca30f8c7-a373-4425-93f6-cfa4c4634150-agent-certs\") pod \"konnectivity-agent-zz6t5\" (UID: \"ca30f8c7-a373-4425-93f6-cfa4c4634150\") " pod="kube-system/konnectivity-agent-zz6t5"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-node-log\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-sys-fs\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-node-log\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-systemd\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-os-release\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-sys-fs\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-cni-netd\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovnkube-script-lib\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.416448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b161493-342d-489a-a1b5-2d34fb7236d6-os-release\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc"
Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpr7j\" (UniqueName: \"kubernetes.io/projected/98fd67e7-d173-4139-9cd2-a4b311033089-kube-api-access-qpr7j\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-netns\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-ovn\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz"
Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-socket-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh"
Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-cni-multus\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg"
Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416625 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName:
\"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovnkube-config\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-ovn\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.416642 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-host-cni-netd\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-run-systemd\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-socket-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.416745 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. No retries permitted until 2026-04-24 19:06:05.916723994 +0000 UTC m=+3.034064361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysconfig\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysctl-conf\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416847 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-lib-modules\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-var-lib-kubelet\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f546568-c9ad-4518-a7a9-893e659002a9-etc-tuned\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6b161493-342d-489a-a1b5-2d34fb7236d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e855347-f4fa-493a-b42a-57f880bfc25d-iptables-alerter-script\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0e855347-f4fa-493a-b42a-57f880bfc25d-host-slash\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.416975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-serviceca\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-hostroot\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e855347-f4fa-493a-b42a-57f880bfc25d-host-slash\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-host\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r5lz\" (UniqueName: 
\"kubernetes.io/projected/0e855347-f4fa-493a-b42a-57f880bfc25d-kube-api-access-2r5lz\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-registration-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-system-cni-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-serviceca\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417847 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-modprobe-d\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovnkube-script-lib\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.417908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/98fd67e7-d173-4139-9cd2-a4b311033089-registration-dir\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.418547 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.417965 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysctl-d\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.418547 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.418228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e855347-f4fa-493a-b42a-57f880bfc25d-iptables-alerter-script\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.419585 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.419340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ca30f8c7-a373-4425-93f6-cfa4c4634150-agent-certs\") pod \"konnectivity-agent-zz6t5\" (UID: \"ca30f8c7-a373-4425-93f6-cfa4c4634150\") " pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:05.424680 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.424655 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zxht\" (UniqueName: \"kubernetes.io/projected/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-kube-api-access-8zxht\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.424901 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.424883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfb1fffa-6d48-4ac4-ae37-ea4c1839473f-ovn-node-metrics-cert\") pod \"ovnkube-node-wt2vz\" (UID: \"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.426506 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.426484 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:05.426506 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.426509 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:05.426643 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.426522 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ddg5p for pod openshift-network-diagnostics/network-check-target-nrb24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:05.426643 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.426612 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p podName:f5236623-3273-4733-a194-9bfd58303272 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:05.926595609 +0000 UTC m=+3.043935990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ddg5p" (UniqueName: "kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p") pod "network-check-target-nrb24" (UID: "f5236623-3273-4733-a194-9bfd58303272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:05.428783 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.428761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zztmz\" (UniqueName: \"kubernetes.io/projected/b0872aa7-303f-4052-9d68-dd136609293b-kube-api-access-zztmz\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:05.428880 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.428815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mn5\" (UniqueName: \"kubernetes.io/projected/6b161493-342d-489a-a1b5-2d34fb7236d6-kube-api-access-h6mn5\") pod \"multus-additional-cni-plugins-t4krc\" (UID: \"6b161493-342d-489a-a1b5-2d34fb7236d6\") " pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.429105 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.429091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vbl\" (UniqueName: \"kubernetes.io/projected/e5bcfed1-92e6-4764-9896-4a9fc77aaef9-kube-api-access-g6vbl\") pod \"node-ca-ntjd2\" (UID: \"e5bcfed1-92e6-4764-9896-4a9fc77aaef9\") " pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.429976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.429959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r5lz\" (UniqueName: 
\"kubernetes.io/projected/0e855347-f4fa-493a-b42a-57f880bfc25d-kube-api-access-2r5lz\") pod \"iptables-alerter-8qp2b\" (UID: \"0e855347-f4fa-493a-b42a-57f880bfc25d\") " pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.431218 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.431198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpr7j\" (UniqueName: \"kubernetes.io/projected/98fd67e7-d173-4139-9cd2-a4b311033089-kube-api-access-qpr7j\") pod \"aws-ebs-csi-driver-node-c47zh\" (UID: \"98fd67e7-d173-4139-9cd2-a4b311033089\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.519160 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-hostroot\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519160 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-host\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-system-cni-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-hostroot\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-host\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-modprobe-d\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519321 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-system-cni-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysctl-d\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f546568-c9ad-4518-a7a9-893e659002a9-tmp\") pod \"tuned-6wv5x\" 
(UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-cni-bin\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-daemon-config\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-kubernetes\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519460 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-modprobe-d\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-conf-dir\") pod \"multus-lbrbg\" (UID: 
\"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysctl-d\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-cni-bin\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-kubernetes\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519557 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-conf-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-multus-certs\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " 
pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-kubelet\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8j65\" (UniqueName: \"kubernetes.io/projected/cfb7edc3-113a-4b12-83d1-66356304b80c-kube-api-access-j8j65\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-multus-certs\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cfb7edc3-113a-4b12-83d1-66356304b80c-cni-binary-copy\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-systemd\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 
19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-kubelet\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519829 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-sys\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-socket-dir-parent\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-k8s-cni-cncf-io\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-cni-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: 
I0424 19:06:05.519870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-sys\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-etc-kubernetes\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.519963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-systemd\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-k8s-cni-cncf-io\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519998 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-daemon-config\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520004 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-cni-dir\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-run\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-run\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-multus-socket-dir-parent\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.519986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-etc-kubernetes\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghkg\" (UniqueName: 
\"kubernetes.io/projected/7f546568-c9ad-4518-a7a9-893e659002a9-kube-api-access-5ghkg\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cfb7edc3-113a-4b12-83d1-66356304b80c-cni-binary-copy\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-os-release\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-cnibin\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-os-release\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-netns\") pod \"multus-lbrbg\" (UID: 
\"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-cni-multus\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysconfig\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysctl-conf\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-lib-modules\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.520485 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-var-lib-cni-multus\") pod \"multus-lbrbg\" (UID: 
\"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-cnibin\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysconfig\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-var-lib-kubelet\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f546568-c9ad-4518-a7a9-893e659002a9-etc-tuned\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfb7edc3-113a-4b12-83d1-66356304b80c-host-run-netns\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 
24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-lib-modules\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-var-lib-kubelet\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.521089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.520554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f546568-c9ad-4518-a7a9-893e659002a9-etc-sysctl-conf\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.521956 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.521940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f546568-c9ad-4518-a7a9-893e659002a9-tmp\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.522585 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.522556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f546568-c9ad-4518-a7a9-893e659002a9-etc-tuned\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.527985 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:06:05.527961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghkg\" (UniqueName: \"kubernetes.io/projected/7f546568-c9ad-4518-a7a9-893e659002a9-kube-api-access-5ghkg\") pod \"tuned-6wv5x\" (UID: \"7f546568-c9ad-4518-a7a9-893e659002a9\") " pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.528199 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.528184 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8j65\" (UniqueName: \"kubernetes.io/projected/cfb7edc3-113a-4b12-83d1-66356304b80c-kube-api-access-j8j65\") pod \"multus-lbrbg\" (UID: \"cfb7edc3-113a-4b12-83d1-66356304b80c\") " pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.610334 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.610302 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ntjd2" Apr 24 19:06:05.618098 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.618075 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t4krc" Apr 24 19:06:05.625844 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.625813 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8qp2b" Apr 24 19:06:05.631299 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.631273 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:05.636877 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.636856 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:05.645455 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.645417 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" Apr 24 19:06:05.650982 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.650966 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" Apr 24 19:06:05.656557 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.656505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lbrbg" Apr 24 19:06:05.906952 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.906923 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b161493_342d_489a_a1b5_2d34fb7236d6.slice/crio-1bb36206432b94aeabbe75546985c9ba6a4801dfdb8bbe629e76dde8db606921 WatchSource:0}: Error finding container 1bb36206432b94aeabbe75546985c9ba6a4801dfdb8bbe629e76dde8db606921: Status 404 returned error can't find the container with id 1bb36206432b94aeabbe75546985c9ba6a4801dfdb8bbe629e76dde8db606921 Apr 24 19:06:05.908794 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.908762 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca30f8c7_a373_4425_93f6_cfa4c4634150.slice/crio-3f852f9e070b849d5c24f1a3983c960d6cd65bbb577f2c53e6beea86b6aa908f WatchSource:0}: Error finding container 3f852f9e070b849d5c24f1a3983c960d6cd65bbb577f2c53e6beea86b6aa908f: Status 404 returned error can't find the container with id 3f852f9e070b849d5c24f1a3983c960d6cd65bbb577f2c53e6beea86b6aa908f Apr 24 19:06:05.910798 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.909880 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb1fffa_6d48_4ac4_ae37_ea4c1839473f.slice/crio-89710b1d4b542c09a6e83f32dd71b3c2cfb1af5f111689fe560f2fc007e95b20 WatchSource:0}: Error finding container 
89710b1d4b542c09a6e83f32dd71b3c2cfb1af5f111689fe560f2fc007e95b20: Status 404 returned error can't find the container with id 89710b1d4b542c09a6e83f32dd71b3c2cfb1af5f111689fe560f2fc007e95b20 Apr 24 19:06:05.911619 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.911578 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f546568_c9ad_4518_a7a9_893e659002a9.slice/crio-b081b328892dadfcd2e327ddb46c8bd34c6023bb268e795e14f44fd0076df82a WatchSource:0}: Error finding container b081b328892dadfcd2e327ddb46c8bd34c6023bb268e795e14f44fd0076df82a: Status 404 returned error can't find the container with id b081b328892dadfcd2e327ddb46c8bd34c6023bb268e795e14f44fd0076df82a Apr 24 19:06:05.913136 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.913116 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5bcfed1_92e6_4764_9896_4a9fc77aaef9.slice/crio-c8582979347802e6511417cc345a7c62720c5c2f5e5f9486fa2f14cf31bfd117 WatchSource:0}: Error finding container c8582979347802e6511417cc345a7c62720c5c2f5e5f9486fa2f14cf31bfd117: Status 404 returned error can't find the container with id c8582979347802e6511417cc345a7c62720c5c2f5e5f9486fa2f14cf31bfd117 Apr 24 19:06:05.913594 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.913569 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98fd67e7_d173_4139_9cd2_a4b311033089.slice/crio-c5682870be7ac763e56b728866f1f99b969ede454d18db6c9bb9159fa81bf0d4 WatchSource:0}: Error finding container c5682870be7ac763e56b728866f1f99b969ede454d18db6c9bb9159fa81bf0d4: Status 404 returned error can't find the container with id c5682870be7ac763e56b728866f1f99b969ede454d18db6c9bb9159fa81bf0d4 Apr 24 19:06:05.914457 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.914341 2573 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb7edc3_113a_4b12_83d1_66356304b80c.slice/crio-112965d787050d2cfc841f8467a31b2f2e1d5e2dcd39d684467961375bed04a4 WatchSource:0}: Error finding container 112965d787050d2cfc841f8467a31b2f2e1d5e2dcd39d684467961375bed04a4: Status 404 returned error can't find the container with id 112965d787050d2cfc841f8467a31b2f2e1d5e2dcd39d684467961375bed04a4 Apr 24 19:06:05.916012 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:05.915324 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e855347_f4fa_493a_b42a_57f880bfc25d.slice/crio-6a9e1dbe7638aa12edbb990942d0484c166072cc3a32ceaab2c485d21279612f WatchSource:0}: Error finding container 6a9e1dbe7638aa12edbb990942d0484c166072cc3a32ceaab2c485d21279612f: Status 404 returned error can't find the container with id 6a9e1dbe7638aa12edbb990942d0484c166072cc3a32ceaab2c485d21279612f Apr 24 19:06:05.923287 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:05.923267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:05.923391 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.923368 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:05.923529 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:05.923419 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:06.923402001 +0000 UTC m=+4.040742380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:06.023712 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.023684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:06.023823 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:06.023811 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:06.023858 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:06.023827 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:06.023858 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:06.023837 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ddg5p for pod openshift-network-diagnostics/network-check-target-nrb24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:06.023938 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:06.023880 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p 
podName:f5236623-3273-4733-a194-9bfd58303272 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:07.023867901 +0000 UTC m=+4.141208268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddg5p" (UniqueName: "kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p") pod "network-check-target-nrb24" (UID: "f5236623-3273-4733-a194-9bfd58303272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:06.341763 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.341658 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:04 +0000 UTC" deadline="2027-11-26 08:46:40.157803081 +0000 UTC" Apr 24 19:06:06.341763 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.341695 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13933h40m33.816111594s" Apr 24 19:06:06.387891 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.387851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zz6t5" event={"ID":"ca30f8c7-a373-4425-93f6-cfa4c4634150","Type":"ContainerStarted","Data":"3f852f9e070b849d5c24f1a3983c960d6cd65bbb577f2c53e6beea86b6aa908f"} Apr 24 19:06:06.393397 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.393363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerStarted","Data":"1bb36206432b94aeabbe75546985c9ba6a4801dfdb8bbe629e76dde8db606921"} Apr 24 19:06:06.400865 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.400832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" 
event={"ID":"946221ebdc75fe55155252abed2eec40","Type":"ContainerStarted","Data":"246ec2e9cda80b40c171475e9aeb9df5402376585cead473600b3299e831f4a1"} Apr 24 19:06:06.403293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.403239 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ntjd2" event={"ID":"e5bcfed1-92e6-4764-9896-4a9fc77aaef9","Type":"ContainerStarted","Data":"c8582979347802e6511417cc345a7c62720c5c2f5e5f9486fa2f14cf31bfd117"} Apr 24 19:06:06.410761 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.410730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8qp2b" event={"ID":"0e855347-f4fa-493a-b42a-57f880bfc25d","Type":"ContainerStarted","Data":"6a9e1dbe7638aa12edbb990942d0484c166072cc3a32ceaab2c485d21279612f"} Apr 24 19:06:06.412846 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.412819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" event={"ID":"98fd67e7-d173-4139-9cd2-a4b311033089","Type":"ContainerStarted","Data":"c5682870be7ac763e56b728866f1f99b969ede454d18db6c9bb9159fa81bf0d4"} Apr 24 19:06:06.417014 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.416986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" event={"ID":"7f546568-c9ad-4518-a7a9-893e659002a9","Type":"ContainerStarted","Data":"b081b328892dadfcd2e327ddb46c8bd34c6023bb268e795e14f44fd0076df82a"} Apr 24 19:06:06.423937 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.423911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbrbg" event={"ID":"cfb7edc3-113a-4b12-83d1-66356304b80c","Type":"ContainerStarted","Data":"112965d787050d2cfc841f8467a31b2f2e1d5e2dcd39d684467961375bed04a4"} Apr 24 19:06:06.430249 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.430223 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"89710b1d4b542c09a6e83f32dd71b3c2cfb1af5f111689fe560f2fc007e95b20"} Apr 24 19:06:06.931768 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:06.931685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:06.931912 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:06.931864 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:06.931974 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:06.931927 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. No retries permitted until 2026-04-24 19:06:08.931908849 +0000 UTC m=+6.049249229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:07.033025 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:07.032988 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:07.033199 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:07.033179 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:07.033199 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:07.033198 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:07.033325 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:07.033210 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ddg5p for pod openshift-network-diagnostics/network-check-target-nrb24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:07.033325 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:07.033279 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p podName:f5236623-3273-4733-a194-9bfd58303272 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:09.03325106 +0000 UTC m=+6.150591429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddg5p" (UniqueName: "kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p") pod "network-check-target-nrb24" (UID: "f5236623-3273-4733-a194-9bfd58303272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:07.378290 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:07.377574 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:07.378290 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:07.377706 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:07.378290 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:07.378109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:07.378290 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:07.378194 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:07.442002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:07.441471 2573 generic.go:358] "Generic (PLEG): container finished" podID="d8ea1cce20b14f0d5ffac11d636ad6aa" containerID="88134937aa8fc86edde542daa53ea022c1493e66c1f850ffc82d2ff9a1e11cf3" exitCode=0 Apr 24 19:06:07.442002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:07.441595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" event={"ID":"d8ea1cce20b14f0d5ffac11d636ad6aa","Type":"ContainerDied","Data":"88134937aa8fc86edde542daa53ea022c1493e66c1f850ffc82d2ff9a1e11cf3"} Apr 24 19:06:07.457094 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:07.457042 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-6.ec2.internal" podStartSLOduration=3.457027098 podStartE2EDuration="3.457027098s" podCreationTimestamp="2026-04-24 19:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:06.41551793 +0000 UTC m=+3.532858322" watchObservedRunningTime="2026-04-24 19:06:07.457027098 +0000 UTC m=+4.574367487" Apr 24 19:06:08.447196 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:08.447160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" event={"ID":"d8ea1cce20b14f0d5ffac11d636ad6aa","Type":"ContainerStarted","Data":"d5f91dc8ddb2b6193b2e3ce752d6c40dcc0959b1aa22c26c5af9b0584ce6ce51"} Apr 24 19:06:08.460409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:08.460358 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-6.ec2.internal" podStartSLOduration=4.460340302 
podStartE2EDuration="4.460340302s" podCreationTimestamp="2026-04-24 19:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:08.460214032 +0000 UTC m=+5.577554432" watchObservedRunningTime="2026-04-24 19:06:08.460340302 +0000 UTC m=+5.577680690" Apr 24 19:06:08.947637 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:08.946860 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:08.947637 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:08.947038 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:08.947637 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:08.947104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. No retries permitted until 2026-04-24 19:06:12.947084829 +0000 UTC m=+10.064425197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:09.048530 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:09.047862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:09.048530 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:09.048042 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:09.048530 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:09.048060 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:09.048530 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:09.048072 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ddg5p for pod openshift-network-diagnostics/network-check-target-nrb24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:09.048530 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:09.048133 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p podName:f5236623-3273-4733-a194-9bfd58303272 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:13.048114933 +0000 UTC m=+10.165455304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddg5p" (UniqueName: "kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p") pod "network-check-target-nrb24" (UID: "f5236623-3273-4733-a194-9bfd58303272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:09.376293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:09.376107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:09.376293 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:09.376222 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:09.376293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:09.376280 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:09.376528 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:09.376333 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:11.375951 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:11.375916 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:11.376338 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:11.376054 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:11.376417 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:11.376398 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:11.376581 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:11.376555 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:12.980572 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:12.980524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:12.981032 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:12.980697 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:12.981032 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:12.980757 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. No retries permitted until 2026-04-24 19:06:20.980742418 +0000 UTC m=+18.098082807 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:13.080890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:13.080854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:13.081063 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:13.081038 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:13.081114 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:13.081066 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:13.081114 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:13.081079 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ddg5p for pod openshift-network-diagnostics/network-check-target-nrb24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:13.081175 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:13.081141 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p podName:f5236623-3273-4733-a194-9bfd58303272 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:21.081120669 +0000 UTC m=+18.198461049 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddg5p" (UniqueName: "kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p") pod "network-check-target-nrb24" (UID: "f5236623-3273-4733-a194-9bfd58303272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:13.377327 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:13.377290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:13.377505 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:13.377376 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:13.377505 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:13.377486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:13.377650 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:13.377626 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:15.375424 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:15.375337 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:15.375811 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:15.375466 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:15.375811 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:15.375525 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:15.375811 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:15.375646 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:17.375821 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:17.375776 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:17.375821 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:17.375804 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:17.376306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:17.375910 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:17.376306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:17.376012 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:19.375942 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:19.375910 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:19.376506 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:19.376036 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:19.376506 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:19.376050 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:19.376506 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:19.376174 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:21.037692 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:21.037654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:21.038152 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.037777 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:21.038152 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.037838 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. No retries permitted until 2026-04-24 19:06:37.037823575 +0000 UTC m=+34.155163941 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:21.138960 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:21.138923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:21.139133 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.139103 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:21.139133 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.139128 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:21.139231 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.139141 2573 projected.go:194] Error preparing data for projected volume kube-api-access-ddg5p for pod openshift-network-diagnostics/network-check-target-nrb24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:21.139231 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.139199 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p podName:f5236623-3273-4733-a194-9bfd58303272 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:37.139179849 +0000 UTC m=+34.256520217 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddg5p" (UniqueName: "kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p") pod "network-check-target-nrb24" (UID: "f5236623-3273-4733-a194-9bfd58303272") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:21.376036 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:21.375924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:21.376036 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:21.375957 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:21.376223 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.376065 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:21.376223 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:21.376203 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:23.385704 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.385366 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:23.386192 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:23.385739 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:23.386192 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.385425 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:23.386192 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:23.386155 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:23.473894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.473865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ntjd2" event={"ID":"e5bcfed1-92e6-4764-9896-4a9fc77aaef9","Type":"ContainerStarted","Data":"7a4f9661fabd3fa297d2144f046978004197191d9c66a0137bb4bee1b1d22b1c"} Apr 24 19:06:23.474974 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.474952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" event={"ID":"98fd67e7-d173-4139-9cd2-a4b311033089","Type":"ContainerStarted","Data":"4e60069f27e9a122616888c8f95a5b7607beca72a2ffe6c1e499a5b629493fcf"} Apr 24 19:06:23.475924 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.475902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" event={"ID":"7f546568-c9ad-4518-a7a9-893e659002a9","Type":"ContainerStarted","Data":"64ff5c2f997f6b8790798c88842f37b73ad8e8378f9392998957ff904a72dd28"} Apr 24 19:06:23.476996 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.476960 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbrbg" event={"ID":"cfb7edc3-113a-4b12-83d1-66356304b80c","Type":"ContainerStarted","Data":"850e5d56be9c06ea99cbcf77f02b5f0a16e5a050c7c909b11e6c390013936134"} Apr 24 19:06:23.478046 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.478014 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zz6t5" event={"ID":"ca30f8c7-a373-4425-93f6-cfa4c4634150","Type":"ContainerStarted","Data":"ec1c9f7ec70adece73290cdabffb3898fc989a04b9fe419271244a0cce4a029e"} Apr 24 19:06:23.479118 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.479098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" 
event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerStarted","Data":"5e3c8352641286321910ad06dd1ca8bbdd7fae3818a38c9604806cdd79d596fd"} Apr 24 19:06:23.503587 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.503539 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ntjd2" podStartSLOduration=3.292641344 podStartE2EDuration="20.503520489s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.915527453 +0000 UTC m=+3.032867821" lastFinishedPulling="2026-04-24 19:06:23.126406597 +0000 UTC m=+20.243746966" observedRunningTime="2026-04-24 19:06:23.486201265 +0000 UTC m=+20.603541656" watchObservedRunningTime="2026-04-24 19:06:23.503520489 +0000 UTC m=+20.620860875" Apr 24 19:06:23.504364 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.504324 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lbrbg" podStartSLOduration=3.238191447 podStartE2EDuration="20.504312854s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.91707425 +0000 UTC m=+3.034414631" lastFinishedPulling="2026-04-24 19:06:23.183195669 +0000 UTC m=+20.300536038" observedRunningTime="2026-04-24 19:06:23.503594697 +0000 UTC m=+20.620935095" watchObservedRunningTime="2026-04-24 19:06:23.504312854 +0000 UTC m=+20.621653243" Apr 24 19:06:23.534895 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.534843 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6wv5x" podStartSLOduration=3.265438975 podStartE2EDuration="20.534827543s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.913759528 +0000 UTC m=+3.031099899" lastFinishedPulling="2026-04-24 19:06:23.183148085 +0000 UTC m=+20.300488467" observedRunningTime="2026-04-24 19:06:23.534295124 +0000 UTC m=+20.651635513" 
watchObservedRunningTime="2026-04-24 19:06:23.534827543 +0000 UTC m=+20.652167931" Apr 24 19:06:23.548856 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:23.548626 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zz6t5" podStartSLOduration=8.115328328 podStartE2EDuration="20.548606566s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.911062456 +0000 UTC m=+3.028402821" lastFinishedPulling="2026-04-24 19:06:18.344340691 +0000 UTC m=+15.461681059" observedRunningTime="2026-04-24 19:06:23.54778672 +0000 UTC m=+20.665127110" watchObservedRunningTime="2026-04-24 19:06:23.548606566 +0000 UTC m=+20.665946956" Apr 24 19:06:24.343293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.343270 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 19:06:24.387247 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.387153 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:06:24.343289836Z","UUID":"d3f202d2-2109-4de6-bb1a-9792dea3a529","Handler":null,"Name":"","Endpoint":""} Apr 24 19:06:24.388611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.388593 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 19:06:24.388697 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.388617 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 19:06:24.481891 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.481859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-8qp2b" event={"ID":"0e855347-f4fa-493a-b42a-57f880bfc25d","Type":"ContainerStarted","Data":"fc8b8d7a7d8d533a6fa22d4c65c3aea820dc972e395dcdd5dfc8c59131cf247c"} Apr 24 19:06:24.483259 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.483234 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" event={"ID":"98fd67e7-d173-4139-9cd2-a4b311033089","Type":"ContainerStarted","Data":"dc0438464b47cc2e8586f66a637aee9579e55892bc32e7920d8935fb7abbfd08"} Apr 24 19:06:24.485510 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.485490 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"4789b5a371c5ee9b82c4bf73136bf283ef9313b5bab3c5d2423272469c598204"} Apr 24 19:06:24.485606 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.485519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"620e66263b42cf3e419f711c03585d238856ea248a371284d7ba4dea70063c3c"} Apr 24 19:06:24.485606 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.485535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"4777d03e6861b1e569caf2c46a6a0f3ca57519d83d0249950fd330dc7eb30116"} Apr 24 19:06:24.485606 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.485552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"072a0867924f8c694c25125ff8dbeecb9fdd869f6432f647709d9725f40b43c5"} Apr 24 19:06:24.485606 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.485564 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"1b379de67d579493c5a64a83b9f81f71fd5feb6ca9e3756a3898d01ec03e9ba2"} Apr 24 19:06:24.485606 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.485575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"80b0f4a785fceb3cc58773b4d105dfa5bdebdf0b8ee2ab7fe19ff1cac0934662"} Apr 24 19:06:24.486700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.486680 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b161493-342d-489a-a1b5-2d34fb7236d6" containerID="5e3c8352641286321910ad06dd1ca8bbdd7fae3818a38c9604806cdd79d596fd" exitCode=0 Apr 24 19:06:24.486801 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.486756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerDied","Data":"5e3c8352641286321910ad06dd1ca8bbdd7fae3818a38c9604806cdd79d596fd"} Apr 24 19:06:24.514525 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:24.514484 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8qp2b" podStartSLOduration=4.251482718 podStartE2EDuration="21.514473888s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.918144331 +0000 UTC m=+3.035484696" lastFinishedPulling="2026-04-24 19:06:23.181135483 +0000 UTC m=+20.298475866" observedRunningTime="2026-04-24 19:06:24.496858984 +0000 UTC m=+21.614199371" watchObservedRunningTime="2026-04-24 19:06:24.514473888 +0000 UTC m=+21.631814275" Apr 24 19:06:25.246583 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:25.246548 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:25.247275 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:25.247250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:25.376062 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:25.376035 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:25.376214 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:25.376178 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:25.376602 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:25.376579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:25.376703 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:25.376671 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:25.491103 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:25.490907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" event={"ID":"98fd67e7-d173-4139-9cd2-a4b311033089","Type":"ContainerStarted","Data":"3b207843e5e3006fc10d0bc00cea1d0802c605e8d06d4948241c3b2940f56d5d"} Apr 24 19:06:25.515713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:25.512935 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c47zh" podStartSLOduration=3.12493297 podStartE2EDuration="22.51292165s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.916056541 +0000 UTC m=+3.033396907" lastFinishedPulling="2026-04-24 19:06:25.304045216 +0000 UTC m=+22.421385587" observedRunningTime="2026-04-24 19:06:25.512139985 +0000 UTC m=+22.629480374" watchObservedRunningTime="2026-04-24 19:06:25.51292165 +0000 UTC m=+22.630262054" Apr 24 19:06:26.495733 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:26.495688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"4a1c3666e7d7c6440fae71aa7d0c8653acc488e50a6c85708cb0e8db74ad114b"} Apr 24 19:06:26.496191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:26.495779 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:06:27.375265 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:27.375223 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:27.375265 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:27.375239 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:27.375565 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:27.375350 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:27.375565 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:27.375475 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:28.042523 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:28.042477 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:28.043050 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:28.042607 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:06:28.043202 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:28.043185 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zz6t5" Apr 24 19:06:28.502338 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:28.502170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" event={"ID":"cfb1fffa-6d48-4ac4-ae37-ea4c1839473f","Type":"ContainerStarted","Data":"5f2b4af86446120a0803ed1b663d045c989a13d8a7a9de7a746610b173501cf4"} Apr 24 19:06:28.507407 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:28.507383 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:28.523737 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:28.523714 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:28.537845 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:28.537796 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" podStartSLOduration=8.030980894 podStartE2EDuration="25.537778081s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.912880581 +0000 UTC m=+3.030220947" lastFinishedPulling="2026-04-24 19:06:23.419677764 +0000 UTC m=+20.537018134" observedRunningTime="2026-04-24 19:06:28.535645585 +0000 UTC m=+25.652985988" watchObservedRunningTime="2026-04-24 19:06:28.537778081 +0000 UTC m=+25.655118469" Apr 24 19:06:29.375912 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.375878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:29.376755 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.375882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:29.376755 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:29.375967 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:29.376755 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:29.376047 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:29.504914 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.504879 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b161493-342d-489a-a1b5-2d34fb7236d6" containerID="3d1ff5db92f33741ac267a3c11057203faa6c6553b69d4e973e234ee7ab8f97b" exitCode=0 Apr 24 19:06:29.505078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.504970 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerDied","Data":"3d1ff5db92f33741ac267a3c11057203faa6c6553b69d4e973e234ee7ab8f97b"} Apr 24 19:06:29.505125 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.505079 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:06:29.505479 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.505456 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:29.520269 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.520247 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:29.905766 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.905733 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b5mbc"] Apr 24 19:06:29.913220 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:29.913184 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:29.916530 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.915758 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 19:06:29.916530 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.915802 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 19:06:29.916530 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:29.915846 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rgrrt\"" Apr 24 19:06:30.002612 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.002580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94kwj\" (UniqueName: \"kubernetes.io/projected/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-kube-api-access-94kwj\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.002612 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.002614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-hosts-file\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.002845 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.002646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-tmp-dir\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " 
pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.103044 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.103010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94kwj\" (UniqueName: \"kubernetes.io/projected/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-kube-api-access-94kwj\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.103295 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.103053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-hosts-file\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.103295 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.103081 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-tmp-dir\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.103295 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.103254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-hosts-file\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.103499 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.103455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-tmp-dir\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.114887 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:30.114858 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94kwj\" (UniqueName: \"kubernetes.io/projected/3ae0b497-fe7c-4446-ba6a-1177ca3da41b-kube-api-access-94kwj\") pod \"node-resolver-b5mbc\" (UID: \"3ae0b497-fe7c-4446-ba6a-1177ca3da41b\") " pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.223965 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.223941 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b5mbc" Apr 24 19:06:30.296398 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.294611 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wv5cl"] Apr 24 19:06:30.300511 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.300486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.300617 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:30.300564 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wv5cl" podUID="7e7fa504-1a33-465d-aa64-5131b6adcc9f" Apr 24 19:06:30.331812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.331286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nrb24"] Apr 24 19:06:30.331812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.331395 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:30.331812 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:30.331506 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:30.333563 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.333536 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tf94j"] Apr 24 19:06:30.333836 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.333664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:30.333836 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:30.333770 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:30.352890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.352856 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wv5cl"] Apr 24 19:06:30.405233 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.405206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7e7fa504-1a33-465d-aa64-5131b6adcc9f-dbus\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.405594 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.405270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7e7fa504-1a33-465d-aa64-5131b6adcc9f-kubelet-config\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.405594 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.405361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.505736 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.505657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7e7fa504-1a33-465d-aa64-5131b6adcc9f-dbus\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.505736 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.505703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7e7fa504-1a33-465d-aa64-5131b6adcc9f-kubelet-config\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.505922 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.505748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.505922 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.505855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7e7fa504-1a33-465d-aa64-5131b6adcc9f-kubelet-config\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.505922 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:30.505864 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:30.505922 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:30.505917 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret podName:7e7fa504-1a33-465d-aa64-5131b6adcc9f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:31.005903447 +0000 UTC m=+28.123243813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret") pod "global-pull-secret-syncer-wv5cl" (UID: "7e7fa504-1a33-465d-aa64-5131b6adcc9f") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:30.506054 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.505937 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7e7fa504-1a33-465d-aa64-5131b6adcc9f-dbus\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.507842 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.507813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b5mbc" event={"ID":"3ae0b497-fe7c-4446-ba6a-1177ca3da41b","Type":"ContainerStarted","Data":"13094844c50d6fad03bd171b5690209be526a057e8e052bee32e5728e986e4b1"} Apr 24 19:06:30.507961 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.507848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b5mbc" event={"ID":"3ae0b497-fe7c-4446-ba6a-1177ca3da41b","Type":"ContainerStarted","Data":"236463db2546e61abb514121172daca7883f9d85b5581b309f36f704027f71e3"} Apr 24 19:06:30.509495 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.509473 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b161493-342d-489a-a1b5-2d34fb7236d6" containerID="b51b1e2fc85b3591390d89da99ffcc0bceabcac697b4e7b5ce853341e08473cb" exitCode=0 Apr 24 19:06:30.509611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.509553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerDied","Data":"b51b1e2fc85b3591390d89da99ffcc0bceabcac697b4e7b5ce853341e08473cb"} Apr 24 19:06:30.509611 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:06:30.509603 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:30.509690 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.509661 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:06:30.509772 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:30.509690 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wv5cl" podUID="7e7fa504-1a33-465d-aa64-5131b6adcc9f" Apr 24 19:06:30.547047 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:30.547009 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b5mbc" podStartSLOduration=1.5469975470000001 podStartE2EDuration="1.546997547s" podCreationTimestamp="2026-04-24 19:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:30.523781617 +0000 UTC m=+27.641122020" watchObservedRunningTime="2026-04-24 19:06:30.546997547 +0000 UTC m=+27.664337912" Apr 24 19:06:31.009613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:31.009581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:31.009803 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:31.009702 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object 
"kube-system"/"original-pull-secret" not registered Apr 24 19:06:31.009803 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:31.009763 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret podName:7e7fa504-1a33-465d-aa64-5131b6adcc9f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:32.009747694 +0000 UTC m=+29.127088065 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret") pod "global-pull-secret-syncer-wv5cl" (UID: "7e7fa504-1a33-465d-aa64-5131b6adcc9f") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:31.513690 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:31.513652 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b161493-342d-489a-a1b5-2d34fb7236d6" containerID="dbc70a0fe4ebf0796ab085a567738e8aab97a155883a4fda50a48c0e140201a6" exitCode=0 Apr 24 19:06:31.514151 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:31.513749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerDied","Data":"dbc70a0fe4ebf0796ab085a567738e8aab97a155883a4fda50a48c0e140201a6"} Apr 24 19:06:31.514151 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:31.513854 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:06:32.016994 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:32.016956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:32.017182 ip-10-0-138-6 
kubenswrapper[2573]: E0424 19:06:32.017032 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:32.017182 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:32.017107 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret podName:7e7fa504-1a33-465d-aa64-5131b6adcc9f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:34.01708511 +0000 UTC m=+31.134425476 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret") pod "global-pull-secret-syncer-wv5cl" (UID: "7e7fa504-1a33-465d-aa64-5131b6adcc9f") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:32.375375 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:32.375309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:32.375544 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:32.375311 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:32.375544 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:32.375417 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:32.375544 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:32.375311 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:32.375690 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:32.375566 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:32.375690 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:32.375627 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wv5cl" podUID="7e7fa504-1a33-465d-aa64-5131b6adcc9f" Apr 24 19:06:34.031358 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:34.031326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:34.032036 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:34.031490 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:34.032036 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:34.031550 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret podName:7e7fa504-1a33-465d-aa64-5131b6adcc9f nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:38.031535308 +0000 UTC m=+35.148875674 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret") pod "global-pull-secret-syncer-wv5cl" (UID: "7e7fa504-1a33-465d-aa64-5131b6adcc9f") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:34.375937 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:34.375659 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:34.376082 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:34.375674 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:34.376082 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:34.375998 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wv5cl" podUID="7e7fa504-1a33-465d-aa64-5131b6adcc9f" Apr 24 19:06:34.376082 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:34.375696 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:34.376082 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:34.376066 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tf94j" podUID="b0872aa7-303f-4052-9d68-dd136609293b" Apr 24 19:06:34.376286 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:34.376140 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nrb24" podUID="f5236623-3273-4733-a194-9bfd58303272" Apr 24 19:06:35.336918 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:35.336882 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:35.337288 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:35.337077 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:06:35.357820 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:35.357794 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wt2vz" Apr 24 19:06:36.185550 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.185467 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-6.ec2.internal" event="NodeReady" Apr 24 19:06:36.185694 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.185626 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 19:06:36.228948 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.228914 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d98d9599b-vk284"] Apr 24 19:06:36.255250 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.255215 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6d2pw"] Apr 24 19:06:36.255398 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.255359 2573 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.258611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.258584 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jb7lz\"" Apr 24 19:06:36.260476 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.260455 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 19:06:36.260716 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.260699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 19:06:36.260877 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.260861 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 19:06:36.267843 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.267821 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 19:06:36.281405 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.281380 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d98d9599b-vk284"] Apr 24 19:06:36.281539 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.281410 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5sq4k"] Apr 24 19:06:36.281605 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.281545 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.284177 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.284159 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 19:06:36.284275 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.284164 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 19:06:36.284275 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.284257 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-km7hc\"" Apr 24 19:06:36.297064 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.297042 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6d2pw"] Apr 24 19:06:36.297064 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.297067 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5sq4k"] Apr 24 19:06:36.297237 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.297152 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:36.300327 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.300305 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 19:06:36.300660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.300640 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvjrt\"" Apr 24 19:06:36.300909 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.300695 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 19:06:36.301807 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.301786 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 19:06:36.352387 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68ba56d1-dd32-46c1-8484-b4074baf3f3f-config-volume\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76l6q\" (UniqueName: 
\"kubernetes.io/projected/68ba56d1-dd32-46c1-8484-b4074baf3f3f-kube-api-access-76l6q\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a910a9ba-6231-47b2-bd86-0c055b1cba96-ca-trust-extracted\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-certificates\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-installation-pull-secrets\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " 
pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352594 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-bound-sa-token\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-trusted-ca\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tntqf\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-kube-api-access-tntqf\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68ba56d1-dd32-46c1-8484-b4074baf3f3f-tmp-dir\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.352772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.352737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-image-registry-private-configuration\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.375660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.375632 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:36.375775 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.375662 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:36.375851 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.375786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:36.378618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.378582 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 19:06:36.378618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.378605 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vpn2d\"" Apr 24 19:06:36.378764 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.378649 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 19:06:36.378889 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.378866 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5l888\"" Apr 24 19:06:36.379078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.379061 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 19:06:36.379269 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.379254 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 19:06:36.453175 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.453175 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68ba56d1-dd32-46c1-8484-b4074baf3f3f-config-volume\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76l6q\" (UniqueName: \"kubernetes.io/projected/68ba56d1-dd32-46c1-8484-b4074baf3f3f-kube-api-access-76l6q\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a910a9ba-6231-47b2-bd86-0c055b1cba96-ca-trust-extracted\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453228 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-certificates\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-installation-pull-secrets\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.453271 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.453347 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls podName:68ba56d1-dd32-46c1-8484-b4074baf3f3f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:36.953327561 +0000 UTC m=+34.070667943 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls") pod "dns-default-6d2pw" (UID: "68ba56d1-dd32-46c1-8484-b4074baf3f3f") : secret "dns-default-metrics-tls" not found Apr 24 19:06:36.453410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-bound-sa-token\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw886\" (UniqueName: \"kubernetes.io/projected/eea44eed-90c5-4bbc-b836-55ef49678cf3-kube-api-access-hw886\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-trusted-ca\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tntqf\" 
(UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-kube-api-access-tntqf\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68ba56d1-dd32-46c1-8484-b4074baf3f3f-tmp-dir\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-image-registry-private-configuration\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.453625 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.453639 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d98d9599b-vk284: secret "image-registry-tls" not found Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.453713 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls podName:a910a9ba-6231-47b2-bd86-0c055b1cba96 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:36.953696789 +0000 UTC m=+34.071037155 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls") pod "image-registry-5d98d9599b-vk284" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96") : secret "image-registry-tls" not found Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a910a9ba-6231-47b2-bd86-0c055b1cba96-ca-trust-extracted\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.453854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.453845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68ba56d1-dd32-46c1-8484-b4074baf3f3f-config-volume\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.454329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.454269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68ba56d1-dd32-46c1-8484-b4074baf3f3f-tmp-dir\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.454382 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.454339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-certificates\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.454706 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.454689 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-trusted-ca\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.457769 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.457749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-image-registry-private-configuration\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.458085 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.458068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-installation-pull-secrets\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.473678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.473652 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-bound-sa-token\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.473815 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.473695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76l6q\" (UniqueName: \"kubernetes.io/projected/68ba56d1-dd32-46c1-8484-b4074baf3f3f-kube-api-access-76l6q\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " 
pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.475117 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.475090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tntqf\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-kube-api-access-tntqf\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.554882 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.554847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:36.555055 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.554973 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:06:36.555111 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.555063 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert podName:eea44eed-90c5-4bbc-b836-55ef49678cf3 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:37.055031521 +0000 UTC m=+34.172371892 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert") pod "ingress-canary-5sq4k" (UID: "eea44eed-90c5-4bbc-b836-55ef49678cf3") : secret "canary-serving-cert" not found Apr 24 19:06:36.555111 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.555092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw886\" (UniqueName: \"kubernetes.io/projected/eea44eed-90c5-4bbc-b836-55ef49678cf3-kube-api-access-hw886\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:36.569529 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.569506 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw886\" (UniqueName: \"kubernetes.io/projected/eea44eed-90c5-4bbc-b836-55ef49678cf3-kube-api-access-hw886\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:36.958798 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.958757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:36.958989 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:36.958890 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:36.958989 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.958920 2573 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:06:36.958989 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.958943 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d98d9599b-vk284: secret "image-registry-tls" not found Apr 24 19:06:36.959146 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.959010 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:06:36.959146 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.959015 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls podName:a910a9ba-6231-47b2-bd86-0c055b1cba96 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:37.958991431 +0000 UTC m=+35.076331811 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls") pod "image-registry-5d98d9599b-vk284" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96") : secret "image-registry-tls" not found Apr 24 19:06:36.959146 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:36.959069 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls podName:68ba56d1-dd32-46c1-8484-b4074baf3f3f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:37.959053716 +0000 UTC m=+35.076394086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls") pod "dns-default-6d2pw" (UID: "68ba56d1-dd32-46c1-8484-b4074baf3f3f") : secret "dns-default-metrics-tls" not found Apr 24 19:06:37.060095 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.060062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:37.060268 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.060110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j" Apr 24 19:06:37.060268 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.060227 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 19:06:37.060268 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.060230 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:06:37.060402 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.060286 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs podName:b0872aa7-303f-4052-9d68-dd136609293b nodeName:}" failed. No retries permitted until 2026-04-24 19:07:09.060270428 +0000 UTC m=+66.177610798 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs") pod "network-metrics-daemon-tf94j" (UID: "b0872aa7-303f-4052-9d68-dd136609293b") : secret "metrics-daemon-secret" not found Apr 24 19:06:37.060402 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.060305 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert podName:eea44eed-90c5-4bbc-b836-55ef49678cf3 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:38.060295726 +0000 UTC m=+35.177636094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert") pod "ingress-canary-5sq4k" (UID: "eea44eed-90c5-4bbc-b836-55ef49678cf3") : secret "canary-serving-cert" not found Apr 24 19:06:37.161128 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.161096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:37.163608 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.163591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddg5p\" (UniqueName: \"kubernetes.io/projected/f5236623-3273-4733-a194-9bfd58303272-kube-api-access-ddg5p\") pod \"network-check-target-nrb24\" (UID: \"f5236623-3273-4733-a194-9bfd58303272\") " pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:37.299971 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.299944 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:37.468340 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.468308 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nrb24"] Apr 24 19:06:37.522198 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:37.522164 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5236623_3273_4733_a194_9bfd58303272.slice/crio-f4f58ccf9afceac9b6263a9a5874dde52d19035109d99c7076c398247b683b9d WatchSource:0}: Error finding container f4f58ccf9afceac9b6263a9a5874dde52d19035109d99c7076c398247b683b9d: Status 404 returned error can't find the container with id f4f58ccf9afceac9b6263a9a5874dde52d19035109d99c7076c398247b683b9d Apr 24 19:06:37.526002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.525975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nrb24" event={"ID":"f5236623-3273-4733-a194-9bfd58303272","Type":"ContainerStarted","Data":"f4f58ccf9afceac9b6263a9a5874dde52d19035109d99c7076c398247b683b9d"} Apr 24 19:06:37.968687 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.968492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:37.968840 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:37.968731 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 
24 19:06:37.968840 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.968664 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:06:37.968840 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.968834 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:06:37.969006 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.968849 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d98d9599b-vk284: secret "image-registry-tls" not found Apr 24 19:06:37.969006 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.968836 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls podName:68ba56d1-dd32-46c1-8484-b4074baf3f3f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:39.968815448 +0000 UTC m=+37.086155818 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls") pod "dns-default-6d2pw" (UID: "68ba56d1-dd32-46c1-8484-b4074baf3f3f") : secret "dns-default-metrics-tls" not found Apr 24 19:06:37.969006 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:37.968905 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls podName:a910a9ba-6231-47b2-bd86-0c055b1cba96 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:39.968889889 +0000 UTC m=+37.086230255 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls") pod "image-registry-5d98d9599b-vk284" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96") : secret "image-registry-tls" not found Apr 24 19:06:38.069597 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.069555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:38.069772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.069626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:38.069772 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:38.069679 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:06:38.069772 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:38.069750 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert podName:eea44eed-90c5-4bbc-b836-55ef49678cf3 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:40.069731413 +0000 UTC m=+37.187071783 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert") pod "ingress-canary-5sq4k" (UID: "eea44eed-90c5-4bbc-b836-55ef49678cf3") : secret "canary-serving-cert" not found Apr 24 19:06:38.073602 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.073576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7e7fa504-1a33-465d-aa64-5131b6adcc9f-original-pull-secret\") pod \"global-pull-secret-syncer-wv5cl\" (UID: \"7e7fa504-1a33-465d-aa64-5131b6adcc9f\") " pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:38.186941 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.186868 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wv5cl" Apr 24 19:06:38.344052 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.344022 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wv5cl"] Apr 24 19:06:38.347600 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:38.347573 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e7fa504_1a33_465d_aa64_5131b6adcc9f.slice/crio-047b4ba214e72d324b4435c6e23bef652fff96cd6c70522733c2cab512a4ba1b WatchSource:0}: Error finding container 047b4ba214e72d324b4435c6e23bef652fff96cd6c70522733c2cab512a4ba1b: Status 404 returned error can't find the container with id 047b4ba214e72d324b4435c6e23bef652fff96cd6c70522733c2cab512a4ba1b Apr 24 19:06:38.531181 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.531144 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b161493-342d-489a-a1b5-2d34fb7236d6" containerID="d3daed96b922065919007410ba6b1aed05690ae4b77ecbfc532b0f38244b3937" exitCode=0 Apr 24 19:06:38.531799 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.531220 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerDied","Data":"d3daed96b922065919007410ba6b1aed05690ae4b77ecbfc532b0f38244b3937"} Apr 24 19:06:38.532414 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.532387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wv5cl" event={"ID":"7e7fa504-1a33-465d-aa64-5131b6adcc9f","Type":"ContainerStarted","Data":"047b4ba214e72d324b4435c6e23bef652fff96cd6c70522733c2cab512a4ba1b"} Apr 24 19:06:38.652459 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.652341 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx"] Apr 24 19:06:38.673449 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.673408 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx"] Apr 24 19:06:38.673599 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.673580 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:38.676573 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.676523 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 19:06:38.676573 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.676528 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 19:06:38.676940 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.676921 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2jh7z\"" Apr 24 19:06:38.775161 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.775132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:38.775298 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.775167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:38.876766 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.876687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:38.876766 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.876734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:38.876995 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:38.876881 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:06:38.877110 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:38.877087 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert podName:9ab7016b-4eb4-436f-945f-9e7f777cdd5a nodeName:}" failed. No retries permitted until 2026-04-24 19:06:39.376931058 +0000 UTC m=+36.494271424 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bnvtx" (UID: "9ab7016b-4eb4-436f-945f-9e7f777cdd5a") : secret "networking-console-plugin-cert" not found Apr 24 19:06:38.887517 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:38.887486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:39.381069 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:39.381038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:39.381274 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:39.381165 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:06:39.381274 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:39.381218 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert podName:9ab7016b-4eb4-436f-945f-9e7f777cdd5a nodeName:}" failed. No retries permitted until 2026-04-24 19:06:40.381204626 +0000 UTC m=+37.498544997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bnvtx" (UID: "9ab7016b-4eb4-436f-945f-9e7f777cdd5a") : secret "networking-console-plugin-cert" not found Apr 24 19:06:39.549108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:39.549023 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b161493-342d-489a-a1b5-2d34fb7236d6" containerID="0c5a3910f707b3300634cb1156b05e7bb8afdd5e8f5bca87cb6eb28a6d257287" exitCode=0 Apr 24 19:06:39.549108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:39.549072 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerDied","Data":"0c5a3910f707b3300634cb1156b05e7bb8afdd5e8f5bca87cb6eb28a6d257287"} Apr 24 19:06:39.986129 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:39.986096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:39.986296 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:39.986205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:39.986371 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:39.986342 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:06:39.986424 ip-10-0-138-6 
kubenswrapper[2573]: E0424 19:06:39.986404 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls podName:68ba56d1-dd32-46c1-8484-b4074baf3f3f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:43.986385237 +0000 UTC m=+41.103725628 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls") pod "dns-default-6d2pw" (UID: "68ba56d1-dd32-46c1-8484-b4074baf3f3f") : secret "dns-default-metrics-tls" not found Apr 24 19:06:39.986622 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:39.986598 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:06:39.986622 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:39.986619 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d98d9599b-vk284: secret "image-registry-tls" not found Apr 24 19:06:39.986806 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:39.986668 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls podName:a910a9ba-6231-47b2-bd86-0c055b1cba96 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:43.986653691 +0000 UTC m=+41.103994070 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls") pod "image-registry-5d98d9599b-vk284" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96") : secret "image-registry-tls" not found Apr 24 19:06:40.087674 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:40.087640 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:40.087841 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:40.087760 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:06:40.087841 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:40.087820 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert podName:eea44eed-90c5-4bbc-b836-55ef49678cf3 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:44.087806454 +0000 UTC m=+41.205146820 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert") pod "ingress-canary-5sq4k" (UID: "eea44eed-90c5-4bbc-b836-55ef49678cf3") : secret "canary-serving-cert" not found Apr 24 19:06:40.390045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:40.389873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:40.390045 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:40.390036 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:06:40.390239 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:40.390113 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert podName:9ab7016b-4eb4-436f-945f-9e7f777cdd5a nodeName:}" failed. No retries permitted until 2026-04-24 19:06:42.390091553 +0000 UTC m=+39.507431919 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bnvtx" (UID: "9ab7016b-4eb4-436f-945f-9e7f777cdd5a") : secret "networking-console-plugin-cert" not found Apr 24 19:06:41.554418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:41.554010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nrb24" event={"ID":"f5236623-3273-4733-a194-9bfd58303272","Type":"ContainerStarted","Data":"fbf6c8be1673a92a8b4d99642d8e362e967ecf1d2bfb4b01973ef6b846701c93"} Apr 24 19:06:41.554418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:41.554229 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nrb24" Apr 24 19:06:41.557385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:41.557361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t4krc" event={"ID":"6b161493-342d-489a-a1b5-2d34fb7236d6","Type":"ContainerStarted","Data":"4b2865e3a12f0d3650d2405193ae9c80cccba03f32a3b9ea0b9bf1a0dc49c9c3"} Apr 24 19:06:41.571222 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:41.571182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nrb24" podStartSLOduration=35.433410166 podStartE2EDuration="38.571168499s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:37.550228172 +0000 UTC m=+34.667568539" lastFinishedPulling="2026-04-24 19:06:40.6879865 +0000 UTC m=+37.805326872" observedRunningTime="2026-04-24 19:06:41.569488466 +0000 UTC m=+38.686828855" watchObservedRunningTime="2026-04-24 19:06:41.571168499 +0000 UTC m=+38.688508887" Apr 24 19:06:41.594579 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:41.594104 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t4krc" podStartSLOduration=6.928446381 podStartE2EDuration="38.594088233s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:06:05.908592768 +0000 UTC m=+3.025933133" lastFinishedPulling="2026-04-24 19:06:37.57423462 +0000 UTC m=+34.691574985" observedRunningTime="2026-04-24 19:06:41.593124589 +0000 UTC m=+38.710464989" watchObservedRunningTime="2026-04-24 19:06:41.594088233 +0000 UTC m=+38.711428625" Apr 24 19:06:42.404353 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:42.404317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:42.404541 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:42.404498 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:06:42.404591 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:42.404565 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert podName:9ab7016b-4eb4-436f-945f-9e7f777cdd5a nodeName:}" failed. No retries permitted until 2026-04-24 19:06:46.404548578 +0000 UTC m=+43.521888947 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bnvtx" (UID: "9ab7016b-4eb4-436f-945f-9e7f777cdd5a") : secret "networking-console-plugin-cert" not found Apr 24 19:06:43.563261 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:43.563215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wv5cl" event={"ID":"7e7fa504-1a33-465d-aa64-5131b6adcc9f","Type":"ContainerStarted","Data":"0026e0491d5b9bc44d0313a52b36b56cb7d30ce69c0f4d7f6345b911c2413f83"} Apr 24 19:06:43.578510 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:43.578464 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wv5cl" podStartSLOduration=9.423009749 podStartE2EDuration="13.578450545s" podCreationTimestamp="2026-04-24 19:06:30 +0000 UTC" firstStartedPulling="2026-04-24 19:06:38.349602868 +0000 UTC m=+35.466943235" lastFinishedPulling="2026-04-24 19:06:42.505043665 +0000 UTC m=+39.622384031" observedRunningTime="2026-04-24 19:06:43.578199121 +0000 UTC m=+40.695539511" watchObservedRunningTime="2026-04-24 19:06:43.578450545 +0000 UTC m=+40.695790926" Apr 24 19:06:44.014051 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:44.014022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:44.014230 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:44.014066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: 
\"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:44.014230 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.014173 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:06:44.014230 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.014184 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d98d9599b-vk284: secret "image-registry-tls" not found Apr 24 19:06:44.014230 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.014184 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:06:44.014378 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.014251 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls podName:a910a9ba-6231-47b2-bd86-0c055b1cba96 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:52.014219148 +0000 UTC m=+49.131559518 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls") pod "image-registry-5d98d9599b-vk284" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96") : secret "image-registry-tls" not found Apr 24 19:06:44.014378 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.014264 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls podName:68ba56d1-dd32-46c1-8484-b4074baf3f3f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:52.014257511 +0000 UTC m=+49.131597877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls") pod "dns-default-6d2pw" (UID: "68ba56d1-dd32-46c1-8484-b4074baf3f3f") : secret "dns-default-metrics-tls" not found Apr 24 19:06:44.115027 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:44.114988 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:44.115169 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.115148 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:06:44.115248 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:44.115231 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert podName:eea44eed-90c5-4bbc-b836-55ef49678cf3 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:52.115210772 +0000 UTC m=+49.232551138 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert") pod "ingress-canary-5sq4k" (UID: "eea44eed-90c5-4bbc-b836-55ef49678cf3") : secret "canary-serving-cert" not found Apr 24 19:06:45.436891 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.436863 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx"] Apr 24 19:06:45.460565 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.460542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx"] Apr 24 19:06:45.460692 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.460653 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" Apr 24 19:06:45.464347 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.464323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-q7xxj\"" Apr 24 19:06:45.465558 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.465538 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:45.465674 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.465556 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 19:06:45.627316 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.627287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8lkv\" (UniqueName: \"kubernetes.io/projected/10b3548c-86e6-44e8-9141-042fa481976e-kube-api-access-t8lkv\") pod \"migrator-74bb7799d9-sspcx\" (UID: \"10b3548c-86e6-44e8-9141-042fa481976e\") " 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" Apr 24 19:06:45.727829 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.727750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8lkv\" (UniqueName: \"kubernetes.io/projected/10b3548c-86e6-44e8-9141-042fa481976e-kube-api-access-t8lkv\") pod \"migrator-74bb7799d9-sspcx\" (UID: \"10b3548c-86e6-44e8-9141-042fa481976e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" Apr 24 19:06:45.741333 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.741303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8lkv\" (UniqueName: \"kubernetes.io/projected/10b3548c-86e6-44e8-9141-042fa481976e-kube-api-access-t8lkv\") pod \"migrator-74bb7799d9-sspcx\" (UID: \"10b3548c-86e6-44e8-9141-042fa481976e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" Apr 24 19:06:45.769276 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.769253 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" Apr 24 19:06:45.889732 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:45.889701 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx"] Apr 24 19:06:45.892482 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:45.892456 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b3548c_86e6_44e8_9141_042fa481976e.slice/crio-31294fa015db0ab9fac80e11c28c98a794c2e4836aecb6d746223840e1a6dab1 WatchSource:0}: Error finding container 31294fa015db0ab9fac80e11c28c98a794c2e4836aecb6d746223840e1a6dab1: Status 404 returned error can't find the container with id 31294fa015db0ab9fac80e11c28c98a794c2e4836aecb6d746223840e1a6dab1 Apr 24 19:06:46.270831 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.270801 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b5mbc_3ae0b497-fe7c-4446-ba6a-1177ca3da41b/dns-node-resolver/0.log" Apr 24 19:06:46.431987 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.431952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:46.432143 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:46.432069 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:06:46.432143 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:46.432127 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert podName:9ab7016b-4eb4-436f-945f-9e7f777cdd5a nodeName:}" failed. No retries permitted until 2026-04-24 19:06:54.432113943 +0000 UTC m=+51.549454311 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bnvtx" (UID: "9ab7016b-4eb4-436f-945f-9e7f777cdd5a") : secret "networking-console-plugin-cert" not found Apr 24 19:06:46.570384 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.570296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" event={"ID":"10b3548c-86e6-44e8-9141-042fa481976e","Type":"ContainerStarted","Data":"31294fa015db0ab9fac80e11c28c98a794c2e4836aecb6d746223840e1a6dab1"} Apr 24 19:06:46.713290 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.713251 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-k29pw"] Apr 24 19:06:46.733777 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.733748 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-k29pw"] Apr 24 19:06:46.733918 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.733850 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.736616 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.736594 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 19:06:46.736747 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.736599 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 19:06:46.737828 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.737799 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 19:06:46.737920 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.737804 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 19:06:46.738100 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.738083 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hkszf\"" Apr 24 19:06:46.834660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.834579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khsq\" (UniqueName: \"kubernetes.io/projected/1a843527-40e6-4539-9fab-298493ee1bfb-kube-api-access-6khsq\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.834660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.834641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a843527-40e6-4539-9fab-298493ee1bfb-signing-key\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" 
Apr 24 19:06:46.834660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.834659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a843527-40e6-4539-9fab-298493ee1bfb-signing-cabundle\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.935941 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.935909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6khsq\" (UniqueName: \"kubernetes.io/projected/1a843527-40e6-4539-9fab-298493ee1bfb-kube-api-access-6khsq\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.936120 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.936054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a843527-40e6-4539-9fab-298493ee1bfb-signing-key\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.936120 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.936088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a843527-40e6-4539-9fab-298493ee1bfb-signing-cabundle\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.936895 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.936869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a843527-40e6-4539-9fab-298493ee1bfb-signing-cabundle\") pod \"service-ca-865cb79987-k29pw\" 
(UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.938938 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.938913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a843527-40e6-4539-9fab-298493ee1bfb-signing-key\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:46.946248 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:46.946223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khsq\" (UniqueName: \"kubernetes.io/projected/1a843527-40e6-4539-9fab-298493ee1bfb-kube-api-access-6khsq\") pod \"service-ca-865cb79987-k29pw\" (UID: \"1a843527-40e6-4539-9fab-298493ee1bfb\") " pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:47.043245 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:47.043209 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-k29pw" Apr 24 19:06:47.181276 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:47.181247 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-k29pw"] Apr 24 19:06:47.183974 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:06:47.183953 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a843527_40e6_4539_9fab_298493ee1bfb.slice/crio-b47c6e8241f91c77ab39ff94a70a0ce883e1242334959a0c233e2e544eaad036 WatchSource:0}: Error finding container b47c6e8241f91c77ab39ff94a70a0ce883e1242334959a0c233e2e544eaad036: Status 404 returned error can't find the container with id b47c6e8241f91c77ab39ff94a70a0ce883e1242334959a0c233e2e544eaad036 Apr 24 19:06:47.572872 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:47.572839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" event={"ID":"10b3548c-86e6-44e8-9141-042fa481976e","Type":"ContainerStarted","Data":"363f05bcb618f0ce35870c5d23f981e795e9ba4e805f284c865a3be50d671188"} Apr 24 19:06:47.573902 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:47.573880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-k29pw" event={"ID":"1a843527-40e6-4539-9fab-298493ee1bfb","Type":"ContainerStarted","Data":"b47c6e8241f91c77ab39ff94a70a0ce883e1242334959a0c233e2e544eaad036"} Apr 24 19:06:47.681033 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:47.680901 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ntjd2_e5bcfed1-92e6-4764-9896-4a9fc77aaef9/node-ca/0.log" Apr 24 19:06:48.577706 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:48.577668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" 
event={"ID":"10b3548c-86e6-44e8-9141-042fa481976e","Type":"ContainerStarted","Data":"3e3e4fb3b5d1cd65b5cab53ca7c000975d7a4976c82300b231fb64eda111efc0"} Apr 24 19:06:48.597324 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:48.597281 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sspcx" podStartSLOduration=2.068499086 podStartE2EDuration="3.59726496s" podCreationTimestamp="2026-04-24 19:06:45 +0000 UTC" firstStartedPulling="2026-04-24 19:06:45.89446957 +0000 UTC m=+43.011809938" lastFinishedPulling="2026-04-24 19:06:47.423235443 +0000 UTC m=+44.540575812" observedRunningTime="2026-04-24 19:06:48.596005681 +0000 UTC m=+45.713346068" watchObservedRunningTime="2026-04-24 19:06:48.59726496 +0000 UTC m=+45.714605350" Apr 24 19:06:49.584708 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:49.584632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-k29pw" event={"ID":"1a843527-40e6-4539-9fab-298493ee1bfb","Type":"ContainerStarted","Data":"7cd58a69b2e5bf6aa0aed25d965223fe959f12fc28bd0e17c30459c26d206453"} Apr 24 19:06:49.602399 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:49.602357 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-k29pw" podStartSLOduration=1.487038958 podStartE2EDuration="3.602346281s" podCreationTimestamp="2026-04-24 19:06:46 +0000 UTC" firstStartedPulling="2026-04-24 19:06:47.186333521 +0000 UTC m=+44.303673886" lastFinishedPulling="2026-04-24 19:06:49.30164084 +0000 UTC m=+46.418981209" observedRunningTime="2026-04-24 19:06:49.601631711 +0000 UTC m=+46.718972099" watchObservedRunningTime="2026-04-24 19:06:49.602346281 +0000 UTC m=+46.719686668" Apr 24 19:06:52.074926 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:52.074884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw" Apr 24 19:06:52.075306 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:52.074948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") pod \"image-registry-5d98d9599b-vk284\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:06:52.075306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.075036 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:06:52.075306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.075046 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:06:52.075306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.075062 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d98d9599b-vk284: secret "image-registry-tls" not found Apr 24 19:06:52.075306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.075098 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls podName:68ba56d1-dd32-46c1-8484-b4074baf3f3f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:08.075082787 +0000 UTC m=+65.192423158 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls") pod "dns-default-6d2pw" (UID: "68ba56d1-dd32-46c1-8484-b4074baf3f3f") : secret "dns-default-metrics-tls" not found Apr 24 19:06:52.075306 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.075112 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls podName:a910a9ba-6231-47b2-bd86-0c055b1cba96 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:08.075106514 +0000 UTC m=+65.192446880 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls") pod "image-registry-5d98d9599b-vk284" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96") : secret "image-registry-tls" not found Apr 24 19:06:52.176094 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:52.176067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k" Apr 24 19:06:52.176214 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.176162 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:06:52.176214 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:52.176213 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert podName:eea44eed-90c5-4bbc-b836-55ef49678cf3 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:08.176197803 +0000 UTC m=+65.293538172 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert") pod "ingress-canary-5sq4k" (UID: "eea44eed-90c5-4bbc-b836-55ef49678cf3") : secret "canary-serving-cert" not found Apr 24 19:06:54.494519 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:06:54.494484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" Apr 24 19:06:54.494965 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:54.494654 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:06:54.494965 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:06:54.494739 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert podName:9ab7016b-4eb4-436f-945f-9e7f777cdd5a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:10.494717875 +0000 UTC m=+67.612058258 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bnvtx" (UID: "9ab7016b-4eb4-436f-945f-9e7f777cdd5a") : secret "networking-console-plugin-cert" not found Apr 24 19:07:05.468798 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.468765 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d98d9599b-vk284"] Apr 24 19:07:05.469223 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:07:05.468935 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5d98d9599b-vk284" podUID="a910a9ba-6231-47b2-bd86-0c055b1cba96" Apr 24 19:07:05.530486 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.530458 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9hxjz"] Apr 24 19:07:05.533653 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.533633 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.536223 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.536190 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 19:07:05.536223 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.536209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 19:07:05.536223 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.536210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nzsf6\"" Apr 24 19:07:05.536408 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.536225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 19:07:05.536565 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.536550 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 19:07:05.542352 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.542330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9hxjz"] Apr 24 19:07:05.576717 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.576687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/623c1c61-52af-445b-b5cd-8972d473f55d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.576841 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.576730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/623c1c61-52af-445b-b5cd-8972d473f55d-crio-socket\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.576841 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.576785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7llg9\" (UniqueName: \"kubernetes.io/projected/623c1c61-52af-445b-b5cd-8972d473f55d-kube-api-access-7llg9\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.576841 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.576830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/623c1c61-52af-445b-b5cd-8972d473f55d-data-volume\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.577008 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.576849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/623c1c61-52af-445b-b5cd-8972d473f55d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.600792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.600771 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-75d6c9bd47-mc8md"] Apr 24 19:07:05.603139 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.603125 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.618009 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.617983 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:07:05.619085 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.619065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75d6c9bd47-mc8md"] Apr 24 19:07:05.621774 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.621758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d98d9599b-vk284" Apr 24 19:07:05.677859 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.677839 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-installation-pull-secrets\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678010 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.677879 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a910a9ba-6231-47b2-bd86-0c055b1cba96-ca-trust-extracted\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678010 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.677899 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-trusted-ca\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678010 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.677916 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tntqf\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-kube-api-access-tntqf\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678010 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.677988 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-certificates\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678208 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678051 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-bound-sa-token\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678208 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678099 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-image-registry-private-configuration\") pod \"a910a9ba-6231-47b2-bd86-0c055b1cba96\" (UID: \"a910a9ba-6231-47b2-bd86-0c055b1cba96\") " Apr 24 19:07:05.678208 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e292081-ae2a-40a6-b7fa-6d1463c221f2-trusted-ca\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.678352 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678222 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e292081-ae2a-40a6-b7fa-6d1463c221f2-registry-certificates\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.678352 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a910a9ba-6231-47b2-bd86-0c055b1cba96-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:07:05.678352 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e292081-ae2a-40a6-b7fa-6d1463c221f2-installation-pull-secrets\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.678542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678389 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:07:05.678542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/623c1c61-52af-445b-b5cd-8972d473f55d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.678542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsz8z\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-kube-api-access-rsz8z\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.678693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7llg9\" (UniqueName: \"kubernetes.io/projected/623c1c61-52af-445b-b5cd-8972d473f55d-kube-api-access-7llg9\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.678693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678571 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/623c1c61-52af-445b-b5cd-8972d473f55d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.678693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e292081-ae2a-40a6-b7fa-6d1463c221f2-image-registry-private-configuration\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.678693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/623c1c61-52af-445b-b5cd-8972d473f55d-data-volume\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.678693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-registry-tls\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.678980 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.678739 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:07:05.679090 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/623c1c61-52af-445b-b5cd-8972d473f55d-crio-socket\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.679148 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e292081-ae2a-40a6-b7fa-6d1463c221f2-ca-trust-extracted\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.679148 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-bound-sa-token\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:05.679243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/623c1c61-52af-445b-b5cd-8972d473f55d-data-volume\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.679243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679167 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-trusted-ca\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:07:05.679243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679212 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-certificates\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:07:05.679243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/623c1c61-52af-445b-b5cd-8972d473f55d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.679243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/623c1c61-52af-445b-b5cd-8972d473f55d-crio-socket\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz" Apr 24 19:07:05.679243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.679232 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a910a9ba-6231-47b2-bd86-0c055b1cba96-ca-trust-extracted\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:07:05.680587 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.680560 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: 
"a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:07:05.680882 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.680859 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-kube-api-access-tntqf" (OuterVolumeSpecName: "kube-api-access-tntqf") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "kube-api-access-tntqf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:07:05.680993 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.680965 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:07:05.681228 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.681208 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a910a9ba-6231-47b2-bd86-0c055b1cba96" (UID: "a910a9ba-6231-47b2-bd86-0c055b1cba96"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:07:05.681538 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.681523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/623c1c61-52af-445b-b5cd-8972d473f55d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz"
Apr 24 19:07:05.688329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.688303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7llg9\" (UniqueName: \"kubernetes.io/projected/623c1c61-52af-445b-b5cd-8972d473f55d-kube-api-access-7llg9\") pod \"insights-runtime-extractor-9hxjz\" (UID: \"623c1c61-52af-445b-b5cd-8972d473f55d\") " pod="openshift-insights/insights-runtime-extractor-9hxjz"
Apr 24 19:07:05.780039 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e292081-ae2a-40a6-b7fa-6d1463c221f2-ca-trust-extracted\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-bound-sa-token\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780065 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e292081-ae2a-40a6-b7fa-6d1463c221f2-trusted-ca\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e292081-ae2a-40a6-b7fa-6d1463c221f2-registry-certificates\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e292081-ae2a-40a6-b7fa-6d1463c221f2-installation-pull-secrets\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsz8z\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-kube-api-access-rsz8z\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e292081-ae2a-40a6-b7fa-6d1463c221f2-image-registry-private-configuration\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-registry-tls\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e292081-ae2a-40a6-b7fa-6d1463c221f2-ca-trust-extracted\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.780651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780627 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-bound-sa-token\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:07:05.780651 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780647 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-image-registry-private-configuration\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:07:05.780809 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780666 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a910a9ba-6231-47b2-bd86-0c055b1cba96-installation-pull-secrets\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:07:05.780809 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.780680 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tntqf\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-kube-api-access-tntqf\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:07:05.781066 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.781025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e292081-ae2a-40a6-b7fa-6d1463c221f2-registry-certificates\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.781191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.781066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e292081-ae2a-40a6-b7fa-6d1463c221f2-trusted-ca\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.782764 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.782747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e292081-ae2a-40a6-b7fa-6d1463c221f2-installation-pull-secrets\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.782847 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.782829 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-registry-tls\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.783111 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.783088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e292081-ae2a-40a6-b7fa-6d1463c221f2-image-registry-private-configuration\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.792094 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.792071 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-bound-sa-token\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.792243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.792228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsz8z\" (UniqueName: \"kubernetes.io/projected/8e292081-ae2a-40a6-b7fa-6d1463c221f2-kube-api-access-rsz8z\") pod \"image-registry-75d6c9bd47-mc8md\" (UID: \"8e292081-ae2a-40a6-b7fa-6d1463c221f2\") " pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.842600 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.842570 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9hxjz"
Apr 24 19:07:05.913552 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.913486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jb7lz\""
Apr 24 19:07:05.921702 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.921641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:05.962106 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:05.962079 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9hxjz"]
Apr 24 19:07:05.964211 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:05.964185 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623c1c61_52af_445b_b5cd_8972d473f55d.slice/crio-608bf89c708c9e66bd3986f7aea55356c752646119b0cba59af4f646eb0da994 WatchSource:0}: Error finding container 608bf89c708c9e66bd3986f7aea55356c752646119b0cba59af4f646eb0da994: Status 404 returned error can't find the container with id 608bf89c708c9e66bd3986f7aea55356c752646119b0cba59af4f646eb0da994
Apr 24 19:07:06.049807 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.049777 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75d6c9bd47-mc8md"]
Apr 24 19:07:06.053519 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:06.053491 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e292081_ae2a_40a6_b7fa_6d1463c221f2.slice/crio-50459f2333acdda18df4e09afee3a0069901d148a7e0aed0ff01f06419e9e3f3 WatchSource:0}: Error finding container 50459f2333acdda18df4e09afee3a0069901d148a7e0aed0ff01f06419e9e3f3: Status 404 returned error can't find the container with id 50459f2333acdda18df4e09afee3a0069901d148a7e0aed0ff01f06419e9e3f3
Apr 24 19:07:06.623695 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.623621 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9hxjz" event={"ID":"623c1c61-52af-445b-b5cd-8972d473f55d","Type":"ContainerStarted","Data":"2bb2618108d54e212f28d2eb01780b0f73fc4a50062dca274634cc367719dd39"}
Apr 24 19:07:06.623695 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.623658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9hxjz" event={"ID":"623c1c61-52af-445b-b5cd-8972d473f55d","Type":"ContainerStarted","Data":"608bf89c708c9e66bd3986f7aea55356c752646119b0cba59af4f646eb0da994"}
Apr 24 19:07:06.624942 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.624914 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d98d9599b-vk284"
Apr 24 19:07:06.625072 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.624939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" event={"ID":"8e292081-ae2a-40a6-b7fa-6d1463c221f2","Type":"ContainerStarted","Data":"e4e0b1db3e7daefb05fa4b5d53a390c03c0ce8f21dbffa0817073c639b682172"}
Apr 24 19:07:06.625072 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.624968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" event={"ID":"8e292081-ae2a-40a6-b7fa-6d1463c221f2","Type":"ContainerStarted","Data":"50459f2333acdda18df4e09afee3a0069901d148a7e0aed0ff01f06419e9e3f3"}
Apr 24 19:07:06.625216 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.625203 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md"
Apr 24 19:07:06.647537 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.647496 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" podStartSLOduration=1.647482559 podStartE2EDuration="1.647482559s" podCreationTimestamp="2026-04-24 19:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:06.647053228 +0000 UTC m=+63.764393616" watchObservedRunningTime="2026-04-24 19:07:06.647482559 +0000 UTC m=+63.764822947"
Apr 24 19:07:06.673999 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.673974 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d98d9599b-vk284"]
Apr 24 19:07:06.677618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.677594 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d98d9599b-vk284"]
Apr 24 19:07:06.789704 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:06.789673 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a910a9ba-6231-47b2-bd86-0c055b1cba96-registry-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:07:07.379194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:07.379123 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a910a9ba-6231-47b2-bd86-0c055b1cba96" path="/var/lib/kubelet/pods/a910a9ba-6231-47b2-bd86-0c055b1cba96/volumes"
Apr 24 19:07:07.628978 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:07.628944 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9hxjz" event={"ID":"623c1c61-52af-445b-b5cd-8972d473f55d","Type":"ContainerStarted","Data":"6a9f6fba7f5edea3aa7a2346042f2d5b327504532ad0d8c3a8e640df2952e599"}
Apr 24 19:07:08.101978 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.101940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw"
Apr 24 19:07:08.104660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.104632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/68ba56d1-dd32-46c1-8484-b4074baf3f3f-metrics-tls\") pod \"dns-default-6d2pw\" (UID: \"68ba56d1-dd32-46c1-8484-b4074baf3f3f\") " pod="openshift-dns/dns-default-6d2pw"
Apr 24 19:07:08.203225 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.203186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k"
Apr 24 19:07:08.206225 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.206198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eea44eed-90c5-4bbc-b836-55ef49678cf3-cert\") pod \"ingress-canary-5sq4k\" (UID: \"eea44eed-90c5-4bbc-b836-55ef49678cf3\") " pod="openshift-ingress-canary/ingress-canary-5sq4k"
Apr 24 19:07:08.396678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.396604 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-km7hc\""
Apr 24 19:07:08.405055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.405022 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6d2pw"
Apr 24 19:07:08.410136 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.410109 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvjrt\""
Apr 24 19:07:08.418213 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.418195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5sq4k"
Apr 24 19:07:08.951667 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.951637 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6d2pw"]
Apr 24 19:07:08.954898 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:08.954870 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ba56d1_dd32_46c1_8484_b4074baf3f3f.slice/crio-117b4d295ea6d0908513253738fbedfee7f1b13e3ef6633357bcfb8f1263a60b WatchSource:0}: Error finding container 117b4d295ea6d0908513253738fbedfee7f1b13e3ef6633357bcfb8f1263a60b: Status 404 returned error can't find the container with id 117b4d295ea6d0908513253738fbedfee7f1b13e3ef6633357bcfb8f1263a60b
Apr 24 19:07:08.966204 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:08.966182 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5sq4k"]
Apr 24 19:07:08.970371 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:08.970349 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea44eed_90c5_4bbc_b836_55ef49678cf3.slice/crio-e3c66d5121a87428ede1a6f75eccb4b1932368133378104c5c07c425b7c17e58 WatchSource:0}: Error finding container e3c66d5121a87428ede1a6f75eccb4b1932368133378104c5c07c425b7c17e58: Status 404 returned error can't find the container with id e3c66d5121a87428ede1a6f75eccb4b1932368133378104c5c07c425b7c17e58
Apr 24 19:07:09.110799 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.110719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j"
Apr 24 19:07:09.113182 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.113159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0872aa7-303f-4052-9d68-dd136609293b-metrics-certs\") pod \"network-metrics-daemon-tf94j\" (UID: \"b0872aa7-303f-4052-9d68-dd136609293b\") " pod="openshift-multus/network-metrics-daemon-tf94j"
Apr 24 19:07:09.396877 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.396805 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vpn2d\""
Apr 24 19:07:09.405216 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.405194 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tf94j"
Apr 24 19:07:09.546015 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.545984 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tf94j"]
Apr 24 19:07:09.549717 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:09.549678 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0872aa7_303f_4052_9d68_dd136609293b.slice/crio-9f724d2e7ebf04aa563fb908a6c9a97e1890b19e29e27e86009556fa5754a78e WatchSource:0}: Error finding container 9f724d2e7ebf04aa563fb908a6c9a97e1890b19e29e27e86009556fa5754a78e: Status 404 returned error can't find the container with id 9f724d2e7ebf04aa563fb908a6c9a97e1890b19e29e27e86009556fa5754a78e
Apr 24 19:07:09.636749 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.636705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9hxjz" event={"ID":"623c1c61-52af-445b-b5cd-8972d473f55d","Type":"ContainerStarted","Data":"af9d22e2320192abb93c66ed9bc87956a3ffcf3c4939e59bc69d659c51b2c0cd"}
Apr 24 19:07:09.638688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.638655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tf94j" event={"ID":"b0872aa7-303f-4052-9d68-dd136609293b","Type":"ContainerStarted","Data":"9f724d2e7ebf04aa563fb908a6c9a97e1890b19e29e27e86009556fa5754a78e"}
Apr 24 19:07:09.640021 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.639993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6d2pw" event={"ID":"68ba56d1-dd32-46c1-8484-b4074baf3f3f","Type":"ContainerStarted","Data":"117b4d295ea6d0908513253738fbedfee7f1b13e3ef6633357bcfb8f1263a60b"}
Apr 24 19:07:09.641296 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.641268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5sq4k" event={"ID":"eea44eed-90c5-4bbc-b836-55ef49678cf3","Type":"ContainerStarted","Data":"e3c66d5121a87428ede1a6f75eccb4b1932368133378104c5c07c425b7c17e58"}
Apr 24 19:07:09.656950 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:09.656862 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9hxjz" podStartSLOduration=1.447454667 podStartE2EDuration="4.656846135s" podCreationTimestamp="2026-04-24 19:07:05 +0000 UTC" firstStartedPulling="2026-04-24 19:07:06.031722987 +0000 UTC m=+63.149063353" lastFinishedPulling="2026-04-24 19:07:09.241114453 +0000 UTC m=+66.358454821" observedRunningTime="2026-04-24 19:07:09.65670821 +0000 UTC m=+66.774048610" watchObservedRunningTime="2026-04-24 19:07:09.656846135 +0000 UTC m=+66.774186524"
Apr 24 19:07:10.523515 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:10.523465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx"
Apr 24 19:07:10.526455 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:10.526411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ab7016b-4eb4-436f-945f-9e7f777cdd5a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bnvtx\" (UID: \"9ab7016b-4eb4-436f-945f-9e7f777cdd5a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx"
Apr 24 19:07:10.799710 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:10.799632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2jh7z\""
Apr 24 19:07:10.807316 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:10.807291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx"
Apr 24 19:07:12.139772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.139743 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx"]
Apr 24 19:07:12.144564 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:12.144507 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab7016b_4eb4_436f_945f_9e7f777cdd5a.slice/crio-da1ef90664000700f681ee142cc3551a371422840118ecba7a6f0855f8eea03a WatchSource:0}: Error finding container da1ef90664000700f681ee142cc3551a371422840118ecba7a6f0855f8eea03a: Status 404 returned error can't find the container with id da1ef90664000700f681ee142cc3551a371422840118ecba7a6f0855f8eea03a
Apr 24 19:07:12.562916 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.562844 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nrb24"
Apr 24 19:07:12.653412 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.653375 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6d2pw" event={"ID":"68ba56d1-dd32-46c1-8484-b4074baf3f3f","Type":"ContainerStarted","Data":"4654c57f582e1ea48d065f6866d6b69ca8bd0e0dd7c72f0169f168db98d75aee"}
Apr 24 19:07:12.653593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.653423 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6d2pw" event={"ID":"68ba56d1-dd32-46c1-8484-b4074baf3f3f","Type":"ContainerStarted","Data":"238f8f5c95205968a76be81c75466fa601c4af89e36e6c581a03b41516ce77dc"}
Apr 24 19:07:12.653593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.653487 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6d2pw"
Apr 24 19:07:12.654683 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.654659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5sq4k" event={"ID":"eea44eed-90c5-4bbc-b836-55ef49678cf3","Type":"ContainerStarted","Data":"cf7fc1a15051354f4d4c0e435499fd3fcde22d91479ac66cd578a134c2ff7a88"}
Apr 24 19:07:12.655700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.655680 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" event={"ID":"9ab7016b-4eb4-436f-945f-9e7f777cdd5a","Type":"ContainerStarted","Data":"da1ef90664000700f681ee142cc3551a371422840118ecba7a6f0855f8eea03a"}
Apr 24 19:07:12.657025 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.656997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tf94j" event={"ID":"b0872aa7-303f-4052-9d68-dd136609293b","Type":"ContainerStarted","Data":"ef68a86a53ff376b3342573a94b334a4f4f0baabb35e84c8f0abdfef86f327b1"}
Apr 24 19:07:12.657105 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.657021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tf94j" event={"ID":"b0872aa7-303f-4052-9d68-dd136609293b","Type":"ContainerStarted","Data":"0a85a00d9041c650a12a4015d9dbeb1a3ba54e496fa10f14fd41fa5f0bbd13b6"}
Apr 24 19:07:12.695701 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.695659 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6d2pw" podStartSLOduration=33.652816898 podStartE2EDuration="36.695647863s" podCreationTimestamp="2026-04-24 19:06:36 +0000 UTC" firstStartedPulling="2026-04-24 19:07:08.957374253 +0000 UTC m=+66.074714623" lastFinishedPulling="2026-04-24 19:07:12.000205195 +0000 UTC m=+69.117545588" observedRunningTime="2026-04-24 19:07:12.675023202 +0000 UTC m=+69.792363589" watchObservedRunningTime="2026-04-24 19:07:12.695647863 +0000 UTC m=+69.812988268"
Apr 24 19:07:12.696089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.696065 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5sq4k" podStartSLOduration=33.666806801 podStartE2EDuration="36.696060027s" podCreationTimestamp="2026-04-24 19:06:36 +0000 UTC" firstStartedPulling="2026-04-24 19:07:08.972162162 +0000 UTC m=+66.089502531" lastFinishedPulling="2026-04-24 19:07:12.001415379 +0000 UTC m=+69.118755757" observedRunningTime="2026-04-24 19:07:12.695283142 +0000 UTC m=+69.812623529" watchObservedRunningTime="2026-04-24 19:07:12.696060027 +0000 UTC m=+69.813400415"
Apr 24 19:07:12.712410 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:12.712361 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tf94j" podStartSLOduration=67.261159768 podStartE2EDuration="1m9.712343098s" podCreationTimestamp="2026-04-24 19:06:03 +0000 UTC" firstStartedPulling="2026-04-24 19:07:09.552423046 +0000 UTC m=+66.669763424" lastFinishedPulling="2026-04-24 19:07:12.003606386 +0000 UTC m=+69.120946754" observedRunningTime="2026-04-24 19:07:12.711300343 +0000 UTC m=+69.828640730" watchObservedRunningTime="2026-04-24 19:07:12.712343098 +0000 UTC m=+69.829683487"
Apr 24 19:07:14.663371 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:14.663337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" event={"ID":"9ab7016b-4eb4-436f-945f-9e7f777cdd5a","Type":"ContainerStarted","Data":"b378749388e654f17dae75bbd0b7ee6b4a2ff976b682a92ba125c2714d9dd9ef"}
Apr 24 19:07:14.680281 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:14.680236 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bnvtx" podStartSLOduration=35.09762837 podStartE2EDuration="36.680223201s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:07:12.146933346 +0000 UTC m=+69.264273712" lastFinishedPulling="2026-04-24 19:07:13.729528173 +0000 UTC m=+70.846868543" observedRunningTime="2026-04-24 19:07:14.679075286 +0000 UTC m=+71.796415698" watchObservedRunningTime="2026-04-24 19:07:14.680223201 +0000 UTC m=+71.797563588"
Apr 24 19:07:22.661854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.661716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6d2pw"
Apr 24 19:07:22.692773 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.692737 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sg2th"]
Apr 24 19:07:22.695311 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.695290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.697871 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.697848 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 19:07:22.697994 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.697925 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m9g75\""
Apr 24 19:07:22.699374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.699243 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 19:07:22.699374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.699271 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 19:07:22.699374 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.699250 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 19:07:22.699581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.699501 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 19:07:22.699581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.699574 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 19:07:22.807426 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-tls\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807624 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-root\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807624 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3989dc21-ce63-4e0b-b43a-94257a2b6be9-metrics-client-ca\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807727 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-accelerators-collector-config\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807727 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-wtmp\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807810 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-sys\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807810 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pntj7\" (UniqueName: \"kubernetes.io/projected/3989dc21-ce63-4e0b-b43a-94257a2b6be9-kube-api-access-pntj7\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807902 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-textfile\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.807951 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.807897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908719 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908719 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-tls\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-root\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3989dc21-ce63-4e0b-b43a-94257a2b6be9-metrics-client-ca\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-accelerators-collector-config\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908863 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-root\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th"
Apr 24 19:07:22.908932
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-wtmp\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.908932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-sys\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909195 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pntj7\" (UniqueName: \"kubernetes.io/projected/3989dc21-ce63-4e0b-b43a-94257a2b6be9-kube-api-access-pntj7\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909195 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-wtmp\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909195 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.908982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3989dc21-ce63-4e0b-b43a-94257a2b6be9-sys\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909195 ip-10-0-138-6 kubenswrapper[2573]: 
I0424 19:07:22.908997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-textfile\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909408 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.909278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-textfile\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909501 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.909460 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3989dc21-ce63-4e0b-b43a-94257a2b6be9-metrics-client-ca\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.909585 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.909564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-accelerators-collector-config\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.911290 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.911269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " 
pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.911470 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.911453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3989dc21-ce63-4e0b-b43a-94257a2b6be9-node-exporter-tls\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:22.925392 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:22.925331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pntj7\" (UniqueName: \"kubernetes.io/projected/3989dc21-ce63-4e0b-b43a-94257a2b6be9-kube-api-access-pntj7\") pod \"node-exporter-sg2th\" (UID: \"3989dc21-ce63-4e0b-b43a-94257a2b6be9\") " pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:23.007390 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:23.007363 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sg2th" Apr 24 19:07:23.015375 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:23.015342 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3989dc21_ce63_4e0b_b43a_94257a2b6be9.slice/crio-ab6810fb845be284d1b28c194d3a66493c8a38bb3c172ed6563b1ab8c05daee9 WatchSource:0}: Error finding container ab6810fb845be284d1b28c194d3a66493c8a38bb3c172ed6563b1ab8c05daee9: Status 404 returned error can't find the container with id ab6810fb845be284d1b28c194d3a66493c8a38bb3c172ed6563b1ab8c05daee9 Apr 24 19:07:23.685622 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:23.685583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sg2th" event={"ID":"3989dc21-ce63-4e0b-b43a-94257a2b6be9","Type":"ContainerStarted","Data":"ab6810fb845be284d1b28c194d3a66493c8a38bb3c172ed6563b1ab8c05daee9"} Apr 24 19:07:24.694025 ip-10-0-138-6 kubenswrapper[2573]: 
I0424 19:07:24.693989 2573 generic.go:358] "Generic (PLEG): container finished" podID="3989dc21-ce63-4e0b-b43a-94257a2b6be9" containerID="5bb11877412589f97dcb9de0bb7cfd0d3063edeeecbb30aa8f7b65b638aff46b" exitCode=0 Apr 24 19:07:24.694025 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:24.694029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sg2th" event={"ID":"3989dc21-ce63-4e0b-b43a-94257a2b6be9","Type":"ContainerDied","Data":"5bb11877412589f97dcb9de0bb7cfd0d3063edeeecbb30aa8f7b65b638aff46b"} Apr 24 19:07:25.698619 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:25.698581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sg2th" event={"ID":"3989dc21-ce63-4e0b-b43a-94257a2b6be9","Type":"ContainerStarted","Data":"81694cfb6aafb4136d5f8ddfb7e9a8662025057ab7a5ed719151fd926d3fbbc2"} Apr 24 19:07:25.698619 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:25.698616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sg2th" event={"ID":"3989dc21-ce63-4e0b-b43a-94257a2b6be9","Type":"ContainerStarted","Data":"c4b3c8598e93f4ed1aeb257e4bf285c41844e67a49555ceb47061fedd256eaf2"} Apr 24 19:07:25.721665 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:25.721618 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sg2th" podStartSLOduration=2.918814028 podStartE2EDuration="3.721604093s" podCreationTimestamp="2026-04-24 19:07:22 +0000 UTC" firstStartedPulling="2026-04-24 19:07:23.017187509 +0000 UTC m=+80.134527876" lastFinishedPulling="2026-04-24 19:07:23.819977563 +0000 UTC m=+80.937317941" observedRunningTime="2026-04-24 19:07:25.720318503 +0000 UTC m=+82.837658904" watchObservedRunningTime="2026-04-24 19:07:25.721604093 +0000 UTC m=+82.838944502" Apr 24 19:07:25.926261 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:25.926228 2573 patch_prober.go:28] interesting 
pod/image-registry-75d6c9bd47-mc8md container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 19:07:25.926405 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:25.926281 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" podUID="8e292081-ae2a-40a6-b7fa-6d1463c221f2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:07:27.293842 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.293807 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-77d8db7bb6-r2jtz"] Apr 24 19:07:27.295663 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.295646 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.299784 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.299754 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 19:07:27.299923 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.299754 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 19:07:27.299923 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.299798 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 19:07:27.299923 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.299828 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5l7ba57kggbuc\"" Apr 24 19:07:27.299923 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.299850 
2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 19:07:27.299923 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.299827 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-tc84s\"" Apr 24 19:07:27.307720 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.307701 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77d8db7bb6-r2jtz"] Apr 24 19:07:27.442599 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/84edfa12-7053-4df2-8b08-9c1938baf06c-metrics-server-audit-profiles\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.442777 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84edfa12-7053-4df2-8b08-9c1938baf06c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.442777 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-secret-metrics-server-client-certs\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 
19:07:27.442777 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/84edfa12-7053-4df2-8b08-9c1938baf06c-audit-log\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.442933 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-secret-metrics-server-tls\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.442933 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5zb\" (UniqueName: \"kubernetes.io/projected/84edfa12-7053-4df2-8b08-9c1938baf06c-kube-api-access-mv5zb\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.442933 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.442892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-client-ca-bundle\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.543911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.543843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/84edfa12-7053-4df2-8b08-9c1938baf06c-audit-log\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.543911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.543883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-secret-metrics-server-tls\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.543911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.543905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5zb\" (UniqueName: \"kubernetes.io/projected/84edfa12-7053-4df2-8b08-9c1938baf06c-kube-api-access-mv5zb\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.544141 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.543931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-client-ca-bundle\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.544141 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.543975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/84edfa12-7053-4df2-8b08-9c1938baf06c-metrics-server-audit-profiles\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 
19:07:27.544141 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.544004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84edfa12-7053-4df2-8b08-9c1938baf06c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.544141 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.544054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-secret-metrics-server-client-certs\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.544321 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.544285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/84edfa12-7053-4df2-8b08-9c1938baf06c-audit-log\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.545029 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.545008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/84edfa12-7053-4df2-8b08-9c1938baf06c-metrics-server-audit-profiles\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.545158 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.545079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/84edfa12-7053-4df2-8b08-9c1938baf06c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.546545 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.546521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-secret-metrics-server-tls\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.546700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.546678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-client-ca-bundle\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.546873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.546845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/84edfa12-7053-4df2-8b08-9c1938baf06c-secret-metrics-server-client-certs\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.552078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.552051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5zb\" (UniqueName: \"kubernetes.io/projected/84edfa12-7053-4df2-8b08-9c1938baf06c-kube-api-access-mv5zb\") pod \"metrics-server-77d8db7bb6-r2jtz\" (UID: \"84edfa12-7053-4df2-8b08-9c1938baf06c\") " pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 
19:07:27.604587 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.604562 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:27.633866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.633842 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-75d6c9bd47-mc8md" Apr 24 19:07:27.754474 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:27.754420 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77d8db7bb6-r2jtz"] Apr 24 19:07:27.757940 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:27.757916 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84edfa12_7053_4df2_8b08_9c1938baf06c.slice/crio-f965f8a10eae325fbe5fdff033f77ed39eddd3aa8b3d5a1f8b614a5a4c7f9c57 WatchSource:0}: Error finding container f965f8a10eae325fbe5fdff033f77ed39eddd3aa8b3d5a1f8b614a5a4c7f9c57: Status 404 returned error can't find the container with id f965f8a10eae325fbe5fdff033f77ed39eddd3aa8b3d5a1f8b614a5a4c7f9c57 Apr 24 19:07:28.711493 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:28.711458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" event={"ID":"84edfa12-7053-4df2-8b08-9c1938baf06c","Type":"ContainerStarted","Data":"f965f8a10eae325fbe5fdff033f77ed39eddd3aa8b3d5a1f8b614a5a4c7f9c57"} Apr 24 19:07:29.715962 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:29.715930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" event={"ID":"84edfa12-7053-4df2-8b08-9c1938baf06c","Type":"ContainerStarted","Data":"fb0fe82453841c5edd59c04fce951489cb1cb1afc130545cbcd9cbf70cd14b34"} Apr 24 19:07:29.739975 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:29.739925 2573 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" podStartSLOduration=1.440593466 podStartE2EDuration="2.739908448s" podCreationTimestamp="2026-04-24 19:07:27 +0000 UTC" firstStartedPulling="2026-04-24 19:07:27.759904128 +0000 UTC m=+84.877244498" lastFinishedPulling="2026-04-24 19:07:29.059219115 +0000 UTC m=+86.176559480" observedRunningTime="2026-04-24 19:07:29.738213927 +0000 UTC m=+86.855554315" watchObservedRunningTime="2026-04-24 19:07:29.739908448 +0000 UTC m=+86.857248839" Apr 24 19:07:31.421026 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.420991 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-jwr7b"] Apr 24 19:07:31.423114 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.423091 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jwr7b" Apr 24 19:07:31.425542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.425520 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-f66s2\"" Apr 24 19:07:31.425812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.425793 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 19:07:31.425885 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.425869 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 19:07:31.436354 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.436335 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jwr7b"] Apr 24 19:07:31.574051 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.574003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62bcm\" (UniqueName: 
\"kubernetes.io/projected/96655824-83e2-40dd-9a26-dc820359dbe0-kube-api-access-62bcm\") pod \"downloads-6bcc868b7-jwr7b\" (UID: \"96655824-83e2-40dd-9a26-dc820359dbe0\") " pod="openshift-console/downloads-6bcc868b7-jwr7b" Apr 24 19:07:31.675166 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.675088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62bcm\" (UniqueName: \"kubernetes.io/projected/96655824-83e2-40dd-9a26-dc820359dbe0-kube-api-access-62bcm\") pod \"downloads-6bcc868b7-jwr7b\" (UID: \"96655824-83e2-40dd-9a26-dc820359dbe0\") " pod="openshift-console/downloads-6bcc868b7-jwr7b" Apr 24 19:07:31.683912 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.683878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62bcm\" (UniqueName: \"kubernetes.io/projected/96655824-83e2-40dd-9a26-dc820359dbe0-kube-api-access-62bcm\") pod \"downloads-6bcc868b7-jwr7b\" (UID: \"96655824-83e2-40dd-9a26-dc820359dbe0\") " pod="openshift-console/downloads-6bcc868b7-jwr7b" Apr 24 19:07:31.732517 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.732478 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jwr7b" Apr 24 19:07:31.850765 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:31.850726 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jwr7b"] Apr 24 19:07:31.854730 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:31.854698 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96655824_83e2_40dd_9a26_dc820359dbe0.slice/crio-ec53ac52721c717298a4f57a0442e3dd97e9c6198300232183dee5a7d9ebdfaa WatchSource:0}: Error finding container ec53ac52721c717298a4f57a0442e3dd97e9c6198300232183dee5a7d9ebdfaa: Status 404 returned error can't find the container with id ec53ac52721c717298a4f57a0442e3dd97e9c6198300232183dee5a7d9ebdfaa Apr 24 19:07:32.727654 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:32.727610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jwr7b" event={"ID":"96655824-83e2-40dd-9a26-dc820359dbe0","Type":"ContainerStarted","Data":"ec53ac52721c717298a4f57a0442e3dd97e9c6198300232183dee5a7d9ebdfaa"} Apr 24 19:07:47.605009 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:47.604972 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:47.605561 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:47.605049 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz" Apr 24 19:07:48.772713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:48.772670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jwr7b" event={"ID":"96655824-83e2-40dd-9a26-dc820359dbe0","Type":"ContainerStarted","Data":"be05cd7a58bf5dc7846419aa17a2f6479b8192412b0391bdd1a835e9f52290d0"} Apr 24 19:07:48.773109 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:07:48.772934 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-jwr7b"
Apr 24 19:07:48.787593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:48.787565 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-jwr7b"
Apr 24 19:07:48.797727 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:48.797675 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-jwr7b" podStartSLOduration=1.8366347269999999 podStartE2EDuration="17.797662367s" podCreationTimestamp="2026-04-24 19:07:31 +0000 UTC" firstStartedPulling="2026-04-24 19:07:31.856674178 +0000 UTC m=+88.974014551" lastFinishedPulling="2026-04-24 19:07:47.817701826 +0000 UTC m=+104.935042191" observedRunningTime="2026-04-24 19:07:48.7956546 +0000 UTC m=+105.912995013" watchObservedRunningTime="2026-04-24 19:07:48.797662367 +0000 UTC m=+105.915002791"
Apr 24 19:07:50.125258 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.125221 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68c88b7b84-hgfkc"]
Apr 24 19:07:50.139341 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.139315 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68c88b7b84-hgfkc"]
Apr 24 19:07:50.139498 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.139459 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.142221 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.142200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 19:07:50.142349 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.142224 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-g5dfp\""
Apr 24 19:07:50.142514 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.142492 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 19:07:50.144507 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.144491 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 19:07:50.144720 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.144706 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 19:07:50.144908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.144896 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 19:07:50.150041 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.150020 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 19:07:50.229384 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229345 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-oauth-serving-cert\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.229384 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-serving-cert\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.229643 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-trusted-ca-bundle\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.229643 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-oauth-config\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.229643 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7hzc\" (UniqueName: \"kubernetes.io/projected/b8d1cf16-8ec3-4a34-b562-403a022e73bb-kube-api-access-d7hzc\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.229643 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-config\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.229643 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.229596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-service-ca\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.330854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.330814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-oauth-serving-cert\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.330854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.330869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-serving-cert\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331097 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-trusted-ca-bundle\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331097 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-oauth-config\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331100 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7hzc\" (UniqueName: \"kubernetes.io/projected/b8d1cf16-8ec3-4a34-b562-403a022e73bb-kube-api-access-d7hzc\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-config\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-service-ca\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331715 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-oauth-serving-cert\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331853 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-service-ca\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.331853 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.331844 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-config\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.332206 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.332181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-trusted-ca-bundle\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.333960 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.333932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-serving-cert\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.333960 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.333950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-oauth-config\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.343953 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.343923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7hzc\" (UniqueName: \"kubernetes.io/projected/b8d1cf16-8ec3-4a34-b562-403a022e73bb-kube-api-access-d7hzc\") pod \"console-68c88b7b84-hgfkc\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") " pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.452411 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.452332 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:07:50.604335 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.604310 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68c88b7b84-hgfkc"]
Apr 24 19:07:50.606659 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:07:50.606624 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8d1cf16_8ec3_4a34_b562_403a022e73bb.slice/crio-b66a9b76f901656da4693767f1aa85b48070f83e0eadc1eab27e1534740a6ac7 WatchSource:0}: Error finding container b66a9b76f901656da4693767f1aa85b48070f83e0eadc1eab27e1534740a6ac7: Status 404 returned error can't find the container with id b66a9b76f901656da4693767f1aa85b48070f83e0eadc1eab27e1534740a6ac7
Apr 24 19:07:50.780214 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:50.780169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c88b7b84-hgfkc" event={"ID":"b8d1cf16-8ec3-4a34-b562-403a022e73bb","Type":"ContainerStarted","Data":"b66a9b76f901656da4693767f1aa85b48070f83e0eadc1eab27e1534740a6ac7"}
Apr 24 19:07:54.794857 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:54.794769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c88b7b84-hgfkc" event={"ID":"b8d1cf16-8ec3-4a34-b562-403a022e73bb","Type":"ContainerStarted","Data":"c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036"}
Apr 24 19:07:54.815797 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:07:54.815749 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68c88b7b84-hgfkc" podStartSLOduration=0.892425492 podStartE2EDuration="4.815733605s" podCreationTimestamp="2026-04-24 19:07:50 +0000 UTC" firstStartedPulling="2026-04-24 19:07:50.608912523 +0000 UTC m=+107.726252890" lastFinishedPulling="2026-04-24 19:07:54.532220631 +0000 UTC m=+111.649561003" observedRunningTime="2026-04-24 19:07:54.814852797 +0000 UTC m=+111.932193179" watchObservedRunningTime="2026-04-24 19:07:54.815733605 +0000 UTC m=+111.933073996"
Apr 24 19:08:00.452968 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:00.452932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:08:00.452968 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:00.452969 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:08:00.457889 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:00.457869 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:08:00.814605 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:00.814578 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:08:07.610742 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:07.610712 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz"
Apr 24 19:08:07.614397 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:07.614370 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-77d8db7bb6-r2jtz"
Apr 24 19:08:20.315152 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.315113 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d9958dbbd-zwwrs"]
Apr 24 19:08:20.333526 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.333494 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9958dbbd-zwwrs"]
Apr 24 19:08:20.333664 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.333606 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470592 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-oauth-config\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470592 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-trusted-ca-bundle\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-service-ca\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-oauth-serving-cert\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-config\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-serving-cert\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.470792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.470754 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pb2h\" (UniqueName: \"kubernetes.io/projected/192ebc14-9357-4ab7-92b8-b5a005eb3693-kube-api-access-4pb2h\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571386 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pb2h\" (UniqueName: \"kubernetes.io/projected/192ebc14-9357-4ab7-92b8-b5a005eb3693-kube-api-access-4pb2h\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571386 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-oauth-config\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-trusted-ca-bundle\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-service-ca\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-oauth-serving-cert\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-config\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.571628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.571586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-serving-cert\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.572259 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.572229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-service-ca\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.572381 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.572337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-config\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.572625 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.572599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-trusted-ca-bundle\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.574130 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.574102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-oauth-config\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.574232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.574145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-serving-cert\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.581055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.581031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pb2h\" (UniqueName: \"kubernetes.io/projected/192ebc14-9357-4ab7-92b8-b5a005eb3693-kube-api-access-4pb2h\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.583251 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.583225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-oauth-serving-cert\") pod \"console-6d9958dbbd-zwwrs\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.642193 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.642160 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:20.772409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.772381 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9958dbbd-zwwrs"]
Apr 24 19:08:20.775103 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:08:20.775079 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod192ebc14_9357_4ab7_92b8_b5a005eb3693.slice/crio-ee809ff1aa94d9144ea01a9fb17a048cf1846508234de8a7ec05008d8a1f40ce WatchSource:0}: Error finding container ee809ff1aa94d9144ea01a9fb17a048cf1846508234de8a7ec05008d8a1f40ce: Status 404 returned error can't find the container with id ee809ff1aa94d9144ea01a9fb17a048cf1846508234de8a7ec05008d8a1f40ce
Apr 24 19:08:20.863134 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:20.863104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9958dbbd-zwwrs" event={"ID":"192ebc14-9357-4ab7-92b8-b5a005eb3693","Type":"ContainerStarted","Data":"ee809ff1aa94d9144ea01a9fb17a048cf1846508234de8a7ec05008d8a1f40ce"}
Apr 24 19:08:21.867802 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:21.867770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9958dbbd-zwwrs" event={"ID":"192ebc14-9357-4ab7-92b8-b5a005eb3693","Type":"ContainerStarted","Data":"a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931"}
Apr 24 19:08:21.888123 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:21.888076 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d9958dbbd-zwwrs" podStartSLOduration=1.888060498 podStartE2EDuration="1.888060498s" podCreationTimestamp="2026-04-24 19:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:08:21.887364333 +0000 UTC m=+139.004704722" watchObservedRunningTime="2026-04-24 19:08:21.888060498 +0000 UTC m=+139.005400885"
Apr 24 19:08:30.643190 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:30.643155 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:30.643706 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:30.643244 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:30.647900 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:30.647880 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:30.899044 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:30.898959 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d9958dbbd-zwwrs"
Apr 24 19:08:30.948170 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:30.948139 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68c88b7b84-hgfkc"]
Apr 24 19:08:55.972955 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:55.972854 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68c88b7b84-hgfkc" podUID="b8d1cf16-8ec3-4a34-b562-403a022e73bb" containerName="console" containerID="cri-o://c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036" gracePeriod=15
Apr 24 19:08:56.211527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.211505 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68c88b7b84-hgfkc_b8d1cf16-8ec3-4a34-b562-403a022e73bb/console/0.log"
Apr 24 19:08:56.211639 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.211569 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:08:56.338017 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.337992 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-config\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338183 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338026 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7hzc\" (UniqueName: \"kubernetes.io/projected/b8d1cf16-8ec3-4a34-b562-403a022e73bb-kube-api-access-d7hzc\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338183 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338056 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-oauth-serving-cert\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338183 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338075 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-service-ca\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338183 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338122 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-trusted-ca-bundle\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338183 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338140 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-oauth-config\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338183 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338162 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-serving-cert\") pod \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\" (UID: \"b8d1cf16-8ec3-4a34-b562-403a022e73bb\") "
Apr 24 19:08:56.338515 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338418 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:56.338515 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338426 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-config" (OuterVolumeSpecName: "console-config") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:56.338628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338571 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:56.338976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.338900 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-service-ca" (OuterVolumeSpecName: "service-ca") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:56.340574 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.340545 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:56.340574 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.340558 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d1cf16-8ec3-4a34-b562-403a022e73bb-kube-api-access-d7hzc" (OuterVolumeSpecName: "kube-api-access-d7hzc") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "kube-api-access-d7hzc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:56.340719 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.340589 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b8d1cf16-8ec3-4a34-b562-403a022e73bb" (UID: "b8d1cf16-8ec3-4a34-b562-403a022e73bb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:56.439639 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439613 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-oauth-serving-cert\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.439639 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439639 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-service-ca\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.439812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439654 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-trusted-ca-bundle\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.439812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439668 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-oauth-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.439812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439680 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-serving-cert\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.439812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439691 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8d1cf16-8ec3-4a34-b562-403a022e73bb-console-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.439812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.439704 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7hzc\" (UniqueName: \"kubernetes.io/projected/b8d1cf16-8ec3-4a34-b562-403a022e73bb-kube-api-access-d7hzc\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:08:56.967995 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.967965 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68c88b7b84-hgfkc_b8d1cf16-8ec3-4a34-b562-403a022e73bb/console/0.log"
Apr 24 19:08:56.968186 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.968011 2573 generic.go:358] "Generic (PLEG): container finished" podID="b8d1cf16-8ec3-4a34-b562-403a022e73bb" containerID="c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036" exitCode=2
Apr 24 19:08:56.968186 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.968072 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c88b7b84-hgfkc" event={"ID":"b8d1cf16-8ec3-4a34-b562-403a022e73bb","Type":"ContainerDied","Data":"c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036"}
Apr 24 19:08:56.968186 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.968104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c88b7b84-hgfkc" event={"ID":"b8d1cf16-8ec3-4a34-b562-403a022e73bb","Type":"ContainerDied","Data":"b66a9b76f901656da4693767f1aa85b48070f83e0eadc1eab27e1534740a6ac7"}
Apr 24 19:08:56.968186 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.968122 2573 scope.go:117] "RemoveContainer" containerID="c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036"
Apr 24 19:08:56.968186 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.968076 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68c88b7b84-hgfkc"
Apr 24 19:08:56.977050 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.976948 2573 scope.go:117] "RemoveContainer" containerID="c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036"
Apr 24 19:08:56.977267 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:08:56.977184 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036\": container with ID starting with c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036 not found: ID does not exist" containerID="c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036"
Apr 24 19:08:56.977267 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.977211 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036"} err="failed to get container status \"c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036\": rpc error: code = NotFound desc = could not find container \"c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036\": container with ID starting with c969dca1747cb362729845175566ca312798aeeb018979a212d398a3ecd23036 not found: ID does not exist"
Apr 24 19:08:56.988295 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.988273 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68c88b7b84-hgfkc"]
Apr 24 19:08:56.992505 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:56.992485 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api"
pods=["openshift-console/console-68c88b7b84-hgfkc"] Apr 24 19:08:57.379489 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:08:57.379427 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d1cf16-8ec3-4a34-b562-403a022e73bb" path="/var/lib/kubelet/pods/b8d1cf16-8ec3-4a34-b562-403a022e73bb/volumes" Apr 24 19:09:42.416496 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:09:42.416454 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d9958dbbd-zwwrs"] Apr 24 19:10:07.438884 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.438815 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d9958dbbd-zwwrs" podUID="192ebc14-9357-4ab7-92b8-b5a005eb3693" containerName="console" containerID="cri-o://a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931" gracePeriod=15 Apr 24 19:10:07.670781 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.670758 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d9958dbbd-zwwrs_192ebc14-9357-4ab7-92b8-b5a005eb3693/console/0.log" Apr 24 19:10:07.670898 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.670820 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d9958dbbd-zwwrs" Apr 24 19:10:07.746849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.746774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pb2h\" (UniqueName: \"kubernetes.io/projected/192ebc14-9357-4ab7-92b8-b5a005eb3693-kube-api-access-4pb2h\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.746849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.746820 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-oauth-config\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.747056 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.746869 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-config\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.747056 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.746904 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-oauth-serving-cert\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.747056 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.746933 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-service-ca\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.747056 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:10:07.746967 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-serving-cert\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.747056 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.746999 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-trusted-ca-bundle\") pod \"192ebc14-9357-4ab7-92b8-b5a005eb3693\" (UID: \"192ebc14-9357-4ab7-92b8-b5a005eb3693\") " Apr 24 19:10:07.747550 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.747320 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:10:07.747550 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.747421 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-config" (OuterVolumeSpecName: "console-config") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:10:07.747550 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.747462 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-service-ca" (OuterVolumeSpecName: "service-ca") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:10:07.747768 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.747746 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:10:07.749100 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.749078 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:10:07.749182 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.749126 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192ebc14-9357-4ab7-92b8-b5a005eb3693-kube-api-access-4pb2h" (OuterVolumeSpecName: "kube-api-access-4pb2h") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "kube-api-access-4pb2h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:10:07.749261 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.749242 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "192ebc14-9357-4ab7-92b8-b5a005eb3693" (UID: "192ebc14-9357-4ab7-92b8-b5a005eb3693"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:10:07.848373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848334 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4pb2h\" (UniqueName: \"kubernetes.io/projected/192ebc14-9357-4ab7-92b8-b5a005eb3693-kube-api-access-4pb2h\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:07.848373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848367 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-oauth-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:07.848373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848376 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:07.848373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848385 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-oauth-serving-cert\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:07.848609 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848394 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-service-ca\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:07.848609 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848426 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/192ebc14-9357-4ab7-92b8-b5a005eb3693-console-serving-cert\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:07.848609 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:07.848453 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192ebc14-9357-4ab7-92b8-b5a005eb3693-trusted-ca-bundle\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:08.148581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.148556 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d9958dbbd-zwwrs_192ebc14-9357-4ab7-92b8-b5a005eb3693/console/0.log" Apr 24 19:10:08.148760 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.148596 2573 generic.go:358] "Generic (PLEG): container finished" podID="192ebc14-9357-4ab7-92b8-b5a005eb3693" containerID="a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931" exitCode=2 Apr 24 19:10:08.148760 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.148630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9958dbbd-zwwrs" event={"ID":"192ebc14-9357-4ab7-92b8-b5a005eb3693","Type":"ContainerDied","Data":"a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931"} Apr 24 19:10:08.148760 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.148670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9958dbbd-zwwrs" event={"ID":"192ebc14-9357-4ab7-92b8-b5a005eb3693","Type":"ContainerDied","Data":"ee809ff1aa94d9144ea01a9fb17a048cf1846508234de8a7ec05008d8a1f40ce"} Apr 24 19:10:08.148760 ip-10-0-138-6 kubenswrapper[2573]: 
I0424 19:10:08.148675 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9958dbbd-zwwrs" Apr 24 19:10:08.148760 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.148689 2573 scope.go:117] "RemoveContainer" containerID="a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931" Apr 24 19:10:08.157988 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.157868 2573 scope.go:117] "RemoveContainer" containerID="a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931" Apr 24 19:10:08.158192 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:10:08.158155 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931\": container with ID starting with a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931 not found: ID does not exist" containerID="a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931" Apr 24 19:10:08.158271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.158197 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931"} err="failed to get container status \"a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931\": rpc error: code = NotFound desc = could not find container \"a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931\": container with ID starting with a1ee095747bc3ccbe0094f375c95888945671476afbf4898bb3b25c908f98931 not found: ID does not exist" Apr 24 19:10:08.171779 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.171760 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d9958dbbd-zwwrs"] Apr 24 19:10:08.175358 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:08.175332 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-6d9958dbbd-zwwrs"] Apr 24 19:10:09.379153 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:09.379120 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192ebc14-9357-4ab7-92b8-b5a005eb3693" path="/var/lib/kubelet/pods/192ebc14-9357-4ab7-92b8-b5a005eb3693/volumes" Apr 24 19:10:24.071905 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.071810 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v"] Apr 24 19:10:24.072373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.072160 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8d1cf16-8ec3-4a34-b562-403a022e73bb" containerName="console" Apr 24 19:10:24.072373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.072174 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d1cf16-8ec3-4a34-b562-403a022e73bb" containerName="console" Apr 24 19:10:24.072373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.072191 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="192ebc14-9357-4ab7-92b8-b5a005eb3693" containerName="console" Apr 24 19:10:24.072373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.072197 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="192ebc14-9357-4ab7-92b8-b5a005eb3693" containerName="console" Apr 24 19:10:24.072373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.072232 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8d1cf16-8ec3-4a34-b562-403a022e73bb" containerName="console" Apr 24 19:10:24.072373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.072239 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="192ebc14-9357-4ab7-92b8-b5a005eb3693" containerName="console" Apr 24 19:10:24.075037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.075022 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.077517 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.077492 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 19:10:24.077637 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.077510 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 19:10:24.078507 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.078490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5r7hq\"" Apr 24 19:10:24.087563 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.087543 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v"] Apr 24 19:10:24.168409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.168377 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68z26\" (UniqueName: \"kubernetes.io/projected/046a696a-59da-45b8-8736-9ecca531b5d5-kube-api-access-68z26\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.168409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.168412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 
19:10:24.168619 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.168531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.269197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.269166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.269346 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.269228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68z26\" (UniqueName: \"kubernetes.io/projected/046a696a-59da-45b8-8736-9ecca531b5d5-kube-api-access-68z26\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.269346 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.269246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.269593 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:10:24.269573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.269639 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.269591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.278448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.278411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68z26\" (UniqueName: \"kubernetes.io/projected/046a696a-59da-45b8-8736-9ecca531b5d5-kube-api-access-68z26\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.383406 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.383329 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:24.499636 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:24.499502 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v"] Apr 24 19:10:24.502113 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:10:24.502088 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046a696a_59da_45b8_8736_9ecca531b5d5.slice/crio-bcf23a5da35c797e542bd8b63ab14fdda3d11b0c1f8012ac2472e567d1b6c2b7 WatchSource:0}: Error finding container bcf23a5da35c797e542bd8b63ab14fdda3d11b0c1f8012ac2472e567d1b6c2b7: Status 404 returned error can't find the container with id bcf23a5da35c797e542bd8b63ab14fdda3d11b0c1f8012ac2472e567d1b6c2b7 Apr 24 19:10:25.192020 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:25.191979 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" event={"ID":"046a696a-59da-45b8-8736-9ecca531b5d5","Type":"ContainerStarted","Data":"bcf23a5da35c797e542bd8b63ab14fdda3d11b0c1f8012ac2472e567d1b6c2b7"} Apr 24 19:10:33.216696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:33.216659 2573 generic.go:358] "Generic (PLEG): container finished" podID="046a696a-59da-45b8-8736-9ecca531b5d5" containerID="09c7b627eeaab8ff0597caebe9c0bfd13d57ef12f1ded49181ac7872a5415dcd" exitCode=0 Apr 24 19:10:33.217111 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:33.216751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" event={"ID":"046a696a-59da-45b8-8736-9ecca531b5d5","Type":"ContainerDied","Data":"09c7b627eeaab8ff0597caebe9c0bfd13d57ef12f1ded49181ac7872a5415dcd"} Apr 24 19:10:36.225907 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:10:36.225870 2573 generic.go:358] "Generic (PLEG): container finished" podID="046a696a-59da-45b8-8736-9ecca531b5d5" containerID="f61e4d770f6b2e08bc8c85890928432cda4e148ce8552f5e94d69a16b10de078" exitCode=0 Apr 24 19:10:36.226368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:36.225914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" event={"ID":"046a696a-59da-45b8-8736-9ecca531b5d5","Type":"ContainerDied","Data":"f61e4d770f6b2e08bc8c85890928432cda4e148ce8552f5e94d69a16b10de078"} Apr 24 19:10:39.451221 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.451182 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97"] Apr 24 19:10:39.455061 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.455028 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.457711 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.457681 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 19:10:39.458774 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.458753 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 19:10:39.458911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.458779 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 19:10:39.458911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.458784 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 19:10:39.462371 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.462348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97"] Apr 24 19:10:39.488876 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.488840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6fv2\" (UniqueName: \"kubernetes.io/projected/434ec83e-48ff-4d1b-94f7-5b113b021c3b-kube-api-access-n6fv2\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.489058 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.488904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/434ec83e-48ff-4d1b-94f7-5b113b021c3b-klusterlet-config\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.489058 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.488938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/434ec83e-48ff-4d1b-94f7-5b113b021c3b-tmp\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.491986 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.491955 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft"] Apr 24 19:10:39.496135 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.496112 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.498910 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.498881 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 19:10:39.498910 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.498905 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 19:10:39.499055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.498905 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 19:10:39.499150 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.499133 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 19:10:39.504922 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.504900 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft"] Apr 24 19:10:39.589315 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fv2\" (UniqueName: \"kubernetes.io/projected/434ec83e-48ff-4d1b-94f7-5b113b021c3b-kube-api-access-n6fv2\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.589536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/d61ee695-9977-4eb8-b901-bc1c644df473-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.589536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/434ec83e-48ff-4d1b-94f7-5b113b021c3b-klusterlet-config\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.589536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/434ec83e-48ff-4d1b-94f7-5b113b021c3b-tmp\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.589536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.589536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-ca\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: 
\"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.589818 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.589818 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbps8\" (UniqueName: \"kubernetes.io/projected/d61ee695-9977-4eb8-b901-bc1c644df473-kube-api-access-tbps8\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.589818 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-hub\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.590013 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.589821 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/434ec83e-48ff-4d1b-94f7-5b113b021c3b-tmp\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.592313 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.592291 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/434ec83e-48ff-4d1b-94f7-5b113b021c3b-klusterlet-config\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.597527 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.597504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fv2\" (UniqueName: \"kubernetes.io/projected/434ec83e-48ff-4d1b-94f7-5b113b021c3b-kube-api-access-n6fv2\") pod \"klusterlet-addon-workmgr-b97f7f78-9bp97\" (UID: \"434ec83e-48ff-4d1b-94f7-5b113b021c3b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.690626 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.690586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.690794 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.690632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-ca\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.690794 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.690657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.690794 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.690783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbps8\" (UniqueName: \"kubernetes.io/projected/d61ee695-9977-4eb8-b901-bc1c644df473-kube-api-access-tbps8\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.690968 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.690845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-hub\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.690968 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.690914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d61ee695-9977-4eb8-b901-bc1c644df473-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.691569 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.691527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d61ee695-9977-4eb8-b901-bc1c644df473-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.693293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.693267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-ca\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.693293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.693289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.693704 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.693684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-hub\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.693770 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.693700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d61ee695-9977-4eb8-b901-bc1c644df473-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.698727 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.698704 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tbps8\" (UniqueName: \"kubernetes.io/projected/d61ee695-9977-4eb8-b901-bc1c644df473-kube-api-access-tbps8\") pod \"cluster-proxy-proxy-agent-56ff5644b-54zft\" (UID: \"d61ee695-9977-4eb8-b901-bc1c644df473\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.766458 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.766422 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:39.816743 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.816579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" Apr 24 19:10:39.897167 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.897063 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97"] Apr 24 19:10:39.899771 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:10:39.899728 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434ec83e_48ff_4d1b_94f7_5b113b021c3b.slice/crio-bced134323b1c61fc47a481205e8afe6277d7e660443fdcc64576a9ce621be3d WatchSource:0}: Error finding container bced134323b1c61fc47a481205e8afe6277d7e660443fdcc64576a9ce621be3d: Status 404 returned error can't find the container with id bced134323b1c61fc47a481205e8afe6277d7e660443fdcc64576a9ce621be3d Apr 24 19:10:39.948190 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:39.948165 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft"] Apr 24 19:10:39.950670 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:10:39.950645 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd61ee695_9977_4eb8_b901_bc1c644df473.slice/crio-906216e6ff83358497a9cc56a9eb559994bf7e663405ea8c9d349bae894892cf WatchSource:0}: Error finding container 906216e6ff83358497a9cc56a9eb559994bf7e663405ea8c9d349bae894892cf: Status 404 returned error can't find the container with id 906216e6ff83358497a9cc56a9eb559994bf7e663405ea8c9d349bae894892cf Apr 24 19:10:40.236911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:40.236830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" event={"ID":"d61ee695-9977-4eb8-b901-bc1c644df473","Type":"ContainerStarted","Data":"906216e6ff83358497a9cc56a9eb559994bf7e663405ea8c9d349bae894892cf"} Apr 24 19:10:40.237758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:40.237730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" event={"ID":"434ec83e-48ff-4d1b-94f7-5b113b021c3b","Type":"ContainerStarted","Data":"bced134323b1c61fc47a481205e8afe6277d7e660443fdcc64576a9ce621be3d"} Apr 24 19:10:48.264447 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.264400 2573 generic.go:358] "Generic (PLEG): container finished" podID="046a696a-59da-45b8-8736-9ecca531b5d5" containerID="ecd94025b7573c037f9cf10998f58d2deecce7c7f50b02d031ed375e4e37887a" exitCode=0 Apr 24 19:10:48.264906 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.264460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" event={"ID":"046a696a-59da-45b8-8736-9ecca531b5d5","Type":"ContainerDied","Data":"ecd94025b7573c037f9cf10998f58d2deecce7c7f50b02d031ed375e4e37887a"} Apr 24 19:10:48.265828 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.265805 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" event={"ID":"d61ee695-9977-4eb8-b901-bc1c644df473","Type":"ContainerStarted","Data":"0601ef3970c537a4d22aa3164fd2b23b90dcc5165503b6e0008952c646134a9b"} Apr 24 19:10:48.266970 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.266944 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" event={"ID":"434ec83e-48ff-4d1b-94f7-5b113b021c3b","Type":"ContainerStarted","Data":"8357bcc20ecea951c676ec182f9384dfed5d158e77a2901a7fbd1eddf12ffb58"} Apr 24 19:10:48.267156 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.267138 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:48.268933 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.268918 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" Apr 24 19:10:48.296476 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:48.296410 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b97f7f78-9bp97" podStartSLOduration=1.530126306 podStartE2EDuration="9.296398178s" podCreationTimestamp="2026-04-24 19:10:39 +0000 UTC" firstStartedPulling="2026-04-24 19:10:39.901389183 +0000 UTC m=+277.018729549" lastFinishedPulling="2026-04-24 19:10:47.667661042 +0000 UTC m=+284.785001421" observedRunningTime="2026-04-24 19:10:48.296059967 +0000 UTC m=+285.413400356" watchObservedRunningTime="2026-04-24 19:10:48.296398178 +0000 UTC m=+285.413738565" Apr 24 19:10:49.713578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.713549 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:49.777800 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.777781 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68z26\" (UniqueName: \"kubernetes.io/projected/046a696a-59da-45b8-8736-9ecca531b5d5-kube-api-access-68z26\") pod \"046a696a-59da-45b8-8736-9ecca531b5d5\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " Apr 24 19:10:49.777871 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.777829 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-util\") pod \"046a696a-59da-45b8-8736-9ecca531b5d5\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " Apr 24 19:10:49.777871 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.777855 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-bundle\") pod \"046a696a-59da-45b8-8736-9ecca531b5d5\" (UID: \"046a696a-59da-45b8-8736-9ecca531b5d5\") " Apr 24 19:10:49.778411 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.778385 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-bundle" (OuterVolumeSpecName: "bundle") pod "046a696a-59da-45b8-8736-9ecca531b5d5" (UID: "046a696a-59da-45b8-8736-9ecca531b5d5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:10:49.780099 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.780072 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046a696a-59da-45b8-8736-9ecca531b5d5-kube-api-access-68z26" (OuterVolumeSpecName: "kube-api-access-68z26") pod "046a696a-59da-45b8-8736-9ecca531b5d5" (UID: "046a696a-59da-45b8-8736-9ecca531b5d5"). InnerVolumeSpecName "kube-api-access-68z26". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:10:49.783446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.783406 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-util" (OuterVolumeSpecName: "util") pod "046a696a-59da-45b8-8736-9ecca531b5d5" (UID: "046a696a-59da-45b8-8736-9ecca531b5d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:10:49.878750 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.878731 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-bundle\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:49.878750 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.878751 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68z26\" (UniqueName: \"kubernetes.io/projected/046a696a-59da-45b8-8736-9ecca531b5d5-kube-api-access-68z26\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:49.878926 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:49.878761 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/046a696a-59da-45b8-8736-9ecca531b5d5-util\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:10:50.273666 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:50.273630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" event={"ID":"046a696a-59da-45b8-8736-9ecca531b5d5","Type":"ContainerDied","Data":"bcf23a5da35c797e542bd8b63ab14fdda3d11b0c1f8012ac2472e567d1b6c2b7"} Apr 24 19:10:50.273666 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:50.273655 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vc6v" Apr 24 19:10:50.273666 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:50.273663 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf23a5da35c797e542bd8b63ab14fdda3d11b0c1f8012ac2472e567d1b6c2b7" Apr 24 19:10:50.275973 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:50.275942 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" event={"ID":"d61ee695-9977-4eb8-b901-bc1c644df473","Type":"ContainerStarted","Data":"d168eb4febd817652c4e44ff5796d6619fa300d85484d31984fbad1a0c435087"} Apr 24 19:10:50.276095 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:50.275981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" event={"ID":"d61ee695-9977-4eb8-b901-bc1c644df473","Type":"ContainerStarted","Data":"4d1f3794a9929671974be3c85b8daadbdef2d07dcd3d53e73ae1e9609feb53f5"} Apr 24 19:10:50.295632 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:10:50.295590 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ff5644b-54zft" podStartSLOduration=1.480668956 podStartE2EDuration="11.295577829s" podCreationTimestamp="2026-04-24 19:10:39 +0000 UTC" firstStartedPulling="2026-04-24 19:10:39.952291824 +0000 UTC m=+277.069632192" lastFinishedPulling="2026-04-24 19:10:49.7672007 +0000 UTC m=+286.884541065" 
observedRunningTime="2026-04-24 19:10:50.29429279 +0000 UTC m=+287.411633179" watchObservedRunningTime="2026-04-24 19:10:50.295577829 +0000 UTC m=+287.412918271" Apr 24 19:11:03.331828 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:11:03.331807 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 19:13:15.801316 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801283 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-9ct4v"] Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801566 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="extract" Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801578 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="extract" Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801602 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="util" Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801611 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="util" Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801621 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="pull" Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801627 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="pull" Apr 24 19:13:15.801782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.801670 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="046a696a-59da-45b8-8736-9ecca531b5d5" containerName="extract" Apr 24 19:13:15.804413 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.804398 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:15.806917 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.806891 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 19:13:15.807197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.807176 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:13:15.807298 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.807185 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-dxw5t\"" Apr 24 19:13:15.808132 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.808110 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:13:15.813816 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.813792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9ct4v"] Apr 24 19:13:15.827456 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.827409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128f5402-7d31-4587-a80a-9e48a31b06ca-tls-certs\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:15.827562 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.827479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmftz\" (UniqueName: \"kubernetes.io/projected/128f5402-7d31-4587-a80a-9e48a31b06ca-kube-api-access-gmftz\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " 
pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:15.928377 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.928346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128f5402-7d31-4587-a80a-9e48a31b06ca-tls-certs\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:15.928377 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.928389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmftz\" (UniqueName: \"kubernetes.io/projected/128f5402-7d31-4587-a80a-9e48a31b06ca-kube-api-access-gmftz\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:15.928595 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:13:15.928508 2573 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 19:13:15.928896 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:13:15.928874 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/128f5402-7d31-4587-a80a-9e48a31b06ca-tls-certs podName:128f5402-7d31-4587-a80a-9e48a31b06ca nodeName:}" failed. No retries permitted until 2026-04-24 19:13:16.428849 +0000 UTC m=+433.546189376 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/128f5402-7d31-4587-a80a-9e48a31b06ca-tls-certs") pod "model-serving-api-86f7b4b499-9ct4v" (UID: "128f5402-7d31-4587-a80a-9e48a31b06ca") : secret "model-serving-api-tls" not found Apr 24 19:13:15.937077 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:15.937044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmftz\" (UniqueName: \"kubernetes.io/projected/128f5402-7d31-4587-a80a-9e48a31b06ca-kube-api-access-gmftz\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:16.430618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:16.430579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128f5402-7d31-4587-a80a-9e48a31b06ca-tls-certs\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:16.433102 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:16.433071 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/128f5402-7d31-4587-a80a-9e48a31b06ca-tls-certs\") pod \"model-serving-api-86f7b4b499-9ct4v\" (UID: \"128f5402-7d31-4587-a80a-9e48a31b06ca\") " pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:16.716091 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:16.716010 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:16.839667 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:16.839644 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9ct4v"] Apr 24 19:13:16.841378 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:13:16.841346 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128f5402_7d31_4587_a80a_9e48a31b06ca.slice/crio-b9f3ea47f090d374c6294308933c703697f8ef65181a42bd6bcdd13df754cd32 WatchSource:0}: Error finding container b9f3ea47f090d374c6294308933c703697f8ef65181a42bd6bcdd13df754cd32: Status 404 returned error can't find the container with id b9f3ea47f090d374c6294308933c703697f8ef65181a42bd6bcdd13df754cd32 Apr 24 19:13:16.843109 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:16.843092 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:13:17.664128 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:17.664076 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9ct4v" event={"ID":"128f5402-7d31-4587-a80a-9e48a31b06ca","Type":"ContainerStarted","Data":"b9f3ea47f090d374c6294308933c703697f8ef65181a42bd6bcdd13df754cd32"} Apr 24 19:13:19.671503 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:19.671461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9ct4v" event={"ID":"128f5402-7d31-4587-a80a-9e48a31b06ca","Type":"ContainerStarted","Data":"bdf6c51eb746b1f3007f3f2acc3dfcf89b015ccb9e9a33ab8fad8889e2633d24"} Apr 24 19:13:19.671860 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:19.671514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:19.687970 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:19.687921 2573 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve/model-serving-api-86f7b4b499-9ct4v" podStartSLOduration=2.367655434 podStartE2EDuration="4.687879559s" podCreationTimestamp="2026-04-24 19:13:15 +0000 UTC" firstStartedPulling="2026-04-24 19:13:16.843220413 +0000 UTC m=+433.960560780" lastFinishedPulling="2026-04-24 19:13:19.163444524 +0000 UTC m=+436.280784905" observedRunningTime="2026-04-24 19:13:19.685965475 +0000 UTC m=+436.803305863" watchObservedRunningTime="2026-04-24 19:13:19.687879559 +0000 UTC m=+436.805219946" Apr 24 19:13:30.680261 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:30.680230 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-9ct4v" Apr 24 19:13:51.873216 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.873184 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx"] Apr 24 19:13:51.876413 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.876397 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:51.878834 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.878808 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 19:13:51.879933 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.879916 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 19:13:51.880047 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.879916 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rsc6c\"" Apr 24 19:13:51.880047 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.879955 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-48e5e-predictor-serving-cert\"" Apr 24 19:13:51.880047 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.879977 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-48e5e-kube-rbac-proxy-sar-config\"" Apr 24 19:13:51.885862 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.885844 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx"] Apr 24 19:13:51.978974 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.978944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/03f34821-7441-4348-8dcf-d4abd092e54b-success-200-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:51.979138 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:13:51.978998 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:51.979138 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:51.979072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7stv\" (UniqueName: \"kubernetes.io/projected/03f34821-7441-4348-8dcf-d4abd092e54b-kube-api-access-k7stv\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.079798 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.079757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.079981 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.079810 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7stv\" (UniqueName: \"kubernetes.io/projected/03f34821-7441-4348-8dcf-d4abd092e54b-kube-api-access-k7stv\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.079981 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.079840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"success-200-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/03f34821-7441-4348-8dcf-d4abd092e54b-success-200-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.079981 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:13:52.079895 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-serving-cert: secret "success-200-isvc-48e5e-predictor-serving-cert" not found Apr 24 19:13:52.079981 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:13:52.079973 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls podName:03f34821-7441-4348-8dcf-d4abd092e54b nodeName:}" failed. No retries permitted until 2026-04-24 19:13:52.579951647 +0000 UTC m=+469.697292016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls") pod "success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" (UID: "03f34821-7441-4348-8dcf-d4abd092e54b") : secret "success-200-isvc-48e5e-predictor-serving-cert" not found Apr 24 19:13:52.080407 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.080389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/03f34821-7441-4348-8dcf-d4abd092e54b-success-200-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.090534 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.090512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7stv\" (UniqueName: \"kubernetes.io/projected/03f34821-7441-4348-8dcf-d4abd092e54b-kube-api-access-k7stv\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.583833 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.583799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls\") pod \"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.586224 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.586195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls\") pod 
\"success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.740378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.740346 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx"] Apr 24 19:13:52.743549 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.743533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.745941 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.745920 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-48e5e-predictor-serving-cert\"" Apr 24 19:13:52.745941 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.745924 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-48e5e-kube-rbac-proxy-sar-config\"" Apr 24 19:13:52.752688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.752661 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx"] Apr 24 19:13:52.785902 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.785874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/457a2da5-4584-4b53-9c42-5fb9d6640940-proxy-tls\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.786055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.785909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/457a2da5-4584-4b53-9c42-5fb9d6640940-error-404-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.786055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.785971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbvh\" (UniqueName: \"kubernetes.io/projected/457a2da5-4584-4b53-9c42-5fb9d6640940-kube-api-access-4dbvh\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.789827 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.789808 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:13:52.886911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.886876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/457a2da5-4584-4b53-9c42-5fb9d6640940-proxy-tls\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.887304 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.886922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/457a2da5-4584-4b53-9c42-5fb9d6640940-error-404-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.887304 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.886958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbvh\" (UniqueName: \"kubernetes.io/projected/457a2da5-4584-4b53-9c42-5fb9d6640940-kube-api-access-4dbvh\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.887728 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.887701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/457a2da5-4584-4b53-9c42-5fb9d6640940-error-404-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.889583 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.889553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/457a2da5-4584-4b53-9c42-5fb9d6640940-proxy-tls\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.894849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.894827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbvh\" (UniqueName: \"kubernetes.io/projected/457a2da5-4584-4b53-9c42-5fb9d6640940-kube-api-access-4dbvh\") pod \"error-404-isvc-48e5e-predictor-5fd594f674-4dgzx\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:52.911635 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:52.911615 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx"] Apr 24 19:13:52.913585 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:13:52.913563 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f34821_7441_4348_8dcf_d4abd092e54b.slice/crio-36cecfe444ad98e36ef863ef87f8ff30a9f0a63dd5536a5ba92110198b2ec526 WatchSource:0}: Error finding container 36cecfe444ad98e36ef863ef87f8ff30a9f0a63dd5536a5ba92110198b2ec526: Status 404 returned error can't find the container with id 36cecfe444ad98e36ef863ef87f8ff30a9f0a63dd5536a5ba92110198b2ec526 Apr 24 19:13:53.061007 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:53.060982 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:13:53.181109 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:53.181085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx"] Apr 24 19:13:53.183227 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:13:53.183201 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod457a2da5_4584_4b53_9c42_5fb9d6640940.slice/crio-032f44c5bc9a25f003a88060ca6b4ccb4ea07510e6ca2b71df329630c23ef78f WatchSource:0}: Error finding container 032f44c5bc9a25f003a88060ca6b4ccb4ea07510e6ca2b71df329630c23ef78f: Status 404 returned error can't find the container with id 032f44c5bc9a25f003a88060ca6b4ccb4ea07510e6ca2b71df329630c23ef78f Apr 24 19:13:53.775346 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:53.775275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" event={"ID":"457a2da5-4584-4b53-9c42-5fb9d6640940","Type":"ContainerStarted","Data":"032f44c5bc9a25f003a88060ca6b4ccb4ea07510e6ca2b71df329630c23ef78f"} Apr 24 
19:13:53.778232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:13:53.778162 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" event={"ID":"03f34821-7441-4348-8dcf-d4abd092e54b","Type":"ContainerStarted","Data":"36cecfe444ad98e36ef863ef87f8ff30a9f0a63dd5536a5ba92110198b2ec526"} Apr 24 19:14:08.834680 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:08.834640 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" event={"ID":"03f34821-7441-4348-8dcf-d4abd092e54b","Type":"ContainerStarted","Data":"74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56"} Apr 24 19:14:08.835879 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:08.835856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" event={"ID":"457a2da5-4584-4b53-9c42-5fb9d6640940","Type":"ContainerStarted","Data":"71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17"} Apr 24 19:14:11.846544 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.846512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" event={"ID":"03f34821-7441-4348-8dcf-d4abd092e54b","Type":"ContainerStarted","Data":"3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6"} Apr 24 19:14:11.846990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.846687 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:14:11.848047 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.848023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" 
event={"ID":"457a2da5-4584-4b53-9c42-5fb9d6640940","Type":"ContainerStarted","Data":"11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a"} Apr 24 19:14:11.848281 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.848264 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:14:11.848336 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.848293 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:14:11.849641 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.849618 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 19:14:11.865528 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.865481 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podStartSLOduration=2.473618163 podStartE2EDuration="20.865468612s" podCreationTimestamp="2026-04-24 19:13:51 +0000 UTC" firstStartedPulling="2026-04-24 19:13:52.915498165 +0000 UTC m=+470.032838533" lastFinishedPulling="2026-04-24 19:14:11.307348614 +0000 UTC m=+488.424688982" observedRunningTime="2026-04-24 19:14:11.864671923 +0000 UTC m=+488.982012312" watchObservedRunningTime="2026-04-24 19:14:11.865468612 +0000 UTC m=+488.982809002" Apr 24 19:14:11.881974 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:11.881935 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podStartSLOduration=1.752046763 podStartE2EDuration="19.881926625s" podCreationTimestamp="2026-04-24 19:13:52 +0000 
UTC" firstStartedPulling="2026-04-24 19:13:53.185300915 +0000 UTC m=+470.302641285" lastFinishedPulling="2026-04-24 19:14:11.315180779 +0000 UTC m=+488.432521147" observedRunningTime="2026-04-24 19:14:11.880771955 +0000 UTC m=+488.998112342" watchObservedRunningTime="2026-04-24 19:14:11.881926625 +0000 UTC m=+488.999267013" Apr 24 19:14:12.851195 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:12.851160 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:14:12.851660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:12.851204 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 19:14:12.852397 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:12.852362 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 19:14:13.854561 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:13.854517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 19:14:17.855516 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:17.855485 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:14:17.855956 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:17.855931 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 19:14:18.858498 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:18.858473 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:14:18.859024 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:18.858999 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 19:14:27.856054 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:27.856015 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 19:14:28.859957 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:28.859912 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 19:14:37.856412 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:37.856369 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: 
connection refused" Apr 24 19:14:38.858989 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:38.858952 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 19:14:47.856786 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:47.856700 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 19:14:48.859283 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:48.859233 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 19:14:57.856597 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:57.856567 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:14:58.859799 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:14:58.859769 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:15:21.949164 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:21.949133 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx"] Apr 24 19:15:21.949590 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:21.949410 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" containerID="cri-o://74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56" gracePeriod=30 Apr 24 19:15:21.949590 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:21.949469 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kube-rbac-proxy" containerID="cri-o://3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6" gracePeriod=30 Apr 24 19:15:22.000351 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.000320 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx"] Apr 24 19:15:22.000701 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.000651 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" containerID="cri-o://71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17" gracePeriod=30 Apr 24 19:15:22.000849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.000673 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kube-rbac-proxy" containerID="cri-o://11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a" gracePeriod=30 Apr 24 19:15:22.021536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.021510 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"] Apr 24 19:15:22.024827 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.024809 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.027121 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.027102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d5820-predictor-serving-cert\"" Apr 24 19:15:22.027121 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.027112 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d5820-kube-rbac-proxy-sar-config\"" Apr 24 19:15:22.033720 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.033699 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"] Apr 24 19:15:22.118623 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.118599 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"] Apr 24 19:15:22.122204 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.122186 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.124915 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.124893 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d5820-kube-rbac-proxy-sar-config\"" Apr 24 19:15:22.124915 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.124893 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d5820-predictor-serving-cert\"" Apr 24 19:15:22.132857 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.132837 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"] Apr 24 19:15:22.213638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.213564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6z6d\" (UniqueName: \"kubernetes.io/projected/b2a3743c-17ad-490b-b19b-1434fb28d669-kube-api-access-c6z6d\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.213638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.213600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b2a3743c-17ad-490b-b19b-1434fb28d669-success-200-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.213814 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.213693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.314638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.314605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6z6d\" (UniqueName: \"kubernetes.io/projected/b2a3743c-17ad-490b-b19b-1434fb28d669-kube-api-access-c6z6d\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.314638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.314645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.314867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.314667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b2a3743c-17ad-490b-b19b-1434fb28d669-success-200-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.314867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.314706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjvc\" (UniqueName: 
\"kubernetes.io/projected/3a8cdfef-d943-4b8b-8641-ef57122233aa-kube-api-access-mxjvc\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.314867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.314731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a8cdfef-d943-4b8b-8641-ef57122233aa-error-404-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.314867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.314792 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.315089 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:22.314891 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-d5820-predictor-serving-cert: secret "success-200-isvc-d5820-predictor-serving-cert" not found Apr 24 19:15:22.315089 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:22.314962 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls podName:b2a3743c-17ad-490b-b19b-1434fb28d669 nodeName:}" failed. No retries permitted until 2026-04-24 19:15:22.814938922 +0000 UTC m=+559.932279293 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls") pod "success-200-isvc-d5820-predictor-5c958b5f7c-995rv" (UID: "b2a3743c-17ad-490b-b19b-1434fb28d669") : secret "success-200-isvc-d5820-predictor-serving-cert" not found Apr 24 19:15:22.315364 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.315346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b2a3743c-17ad-490b-b19b-1434fb28d669-success-200-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.323048 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.323021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6z6d\" (UniqueName: \"kubernetes.io/projected/b2a3743c-17ad-490b-b19b-1434fb28d669-kube-api-access-c6z6d\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.415892 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.415858 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.415892 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.415896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjvc\" (UniqueName: 
\"kubernetes.io/projected/3a8cdfef-d943-4b8b-8641-ef57122233aa-kube-api-access-mxjvc\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.416077 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.415929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a8cdfef-d943-4b8b-8641-ef57122233aa-error-404-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.416077 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:22.416015 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d5820-predictor-serving-cert: secret "error-404-isvc-d5820-predictor-serving-cert" not found Apr 24 19:15:22.416157 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:22.416080 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls podName:3a8cdfef-d943-4b8b-8641-ef57122233aa nodeName:}" failed. No retries permitted until 2026-04-24 19:15:22.916060663 +0000 UTC m=+560.033401031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls") pod "error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" (UID: "3a8cdfef-d943-4b8b-8641-ef57122233aa") : secret "error-404-isvc-d5820-predictor-serving-cert" not found Apr 24 19:15:22.416589 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.416571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a8cdfef-d943-4b8b-8641-ef57122233aa-error-404-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.424908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.424891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjvc\" (UniqueName: \"kubernetes.io/projected/3a8cdfef-d943-4b8b-8641-ef57122233aa-kube-api-access-mxjvc\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.819816 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.819776 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls\") pod \"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.822346 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.822322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls\") pod 
\"success-200-isvc-d5820-predictor-5c958b5f7c-995rv\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") " pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:22.851482 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.851425 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 24 19:15:22.920381 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.920346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.922825 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.922801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls\") pod \"error-404-isvc-d5820-predictor-5b7d74d57d-j44tx\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") " pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:22.936712 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:22.936691 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:23.033804 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.033775 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:23.043286 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.043062 2573 generic.go:358] "Generic (PLEG): container finished" podID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerID="11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a" exitCode=2 Apr 24 19:15:23.043286 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.043129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" event={"ID":"457a2da5-4584-4b53-9c42-5fb9d6640940","Type":"ContainerDied","Data":"11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a"} Apr 24 19:15:23.044996 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.044967 2573 generic.go:358] "Generic (PLEG): container finished" podID="03f34821-7441-4348-8dcf-d4abd092e54b" containerID="3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6" exitCode=2 Apr 24 19:15:23.045095 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.045032 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" event={"ID":"03f34821-7441-4348-8dcf-d4abd092e54b","Type":"ContainerDied","Data":"3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6"} Apr 24 19:15:23.061060 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.060946 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"] Apr 24 19:15:23.063512 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:15:23.063479 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a3743c_17ad_490b_b19b_1434fb28d669.slice/crio-70c092b38764eaab2f08bd16e514192f921b73f336e79b7d808ee52a8952a916 WatchSource:0}: Error finding container 
70c092b38764eaab2f08bd16e514192f921b73f336e79b7d808ee52a8952a916: Status 404 returned error can't find the container with id 70c092b38764eaab2f08bd16e514192f921b73f336e79b7d808ee52a8952a916 Apr 24 19:15:23.161069 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.160977 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"] Apr 24 19:15:23.163649 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:15:23.163622 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a8cdfef_d943_4b8b_8641_ef57122233aa.slice/crio-4a673b1c612ca6f83416462eb1832519cf777ff953748dab1ab9d0f0929f9562 WatchSource:0}: Error finding container 4a673b1c612ca6f83416462eb1832519cf777ff953748dab1ab9d0f0929f9562: Status 404 returned error can't find the container with id 4a673b1c612ca6f83416462eb1832519cf777ff953748dab1ab9d0f0929f9562 Apr 24 19:15:23.855618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:23.855569 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 19:15:24.049871 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.049835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" event={"ID":"b2a3743c-17ad-490b-b19b-1434fb28d669","Type":"ContainerStarted","Data":"ea407126ed9854bba7dbd2aef0e5ec6e4daf4efacedf58ef269a6e0e6f752e9f"} Apr 24 19:15:24.049871 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.049875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" 
event={"ID":"b2a3743c-17ad-490b-b19b-1434fb28d669","Type":"ContainerStarted","Data":"dcbe55b26169184d87746f52a9b58e940f8ddf1905a0e453167bdb19cdc06416"} Apr 24 19:15:24.050391 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.049885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" event={"ID":"b2a3743c-17ad-490b-b19b-1434fb28d669","Type":"ContainerStarted","Data":"70c092b38764eaab2f08bd16e514192f921b73f336e79b7d808ee52a8952a916"} Apr 24 19:15:24.050391 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.050034 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:24.050391 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.050157 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:24.051477 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.051426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" event={"ID":"3a8cdfef-d943-4b8b-8641-ef57122233aa","Type":"ContainerStarted","Data":"80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2"} Apr 24 19:15:24.051477 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.051467 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" event={"ID":"3a8cdfef-d943-4b8b-8641-ef57122233aa","Type":"ContainerStarted","Data":"f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf"} Apr 24 19:15:24.051477 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.051476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" 
event={"ID":"3a8cdfef-d943-4b8b-8641-ef57122233aa","Type":"ContainerStarted","Data":"4a673b1c612ca6f83416462eb1832519cf777ff953748dab1ab9d0f0929f9562"} Apr 24 19:15:24.051699 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.051680 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:24.051758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.051710 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:24.051758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.051738 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 19:15:24.052670 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.052646 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 19:15:24.069212 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.069157 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podStartSLOduration=2.069141195 podStartE2EDuration="2.069141195s" podCreationTimestamp="2026-04-24 19:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:15:24.06753589 +0000 UTC m=+561.184876280" watchObservedRunningTime="2026-04-24 19:15:24.069141195 +0000 UTC m=+561.186481584" Apr 24 
19:15:24.085363 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:24.085312 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podStartSLOduration=2.085299172 podStartE2EDuration="2.085299172s" podCreationTimestamp="2026-04-24 19:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:15:24.083965036 +0000 UTC m=+561.201305423" watchObservedRunningTime="2026-04-24 19:15:24.085299172 +0000 UTC m=+561.202639557" Apr 24 19:15:25.054902 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.054857 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 19:15:25.055328 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.054960 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 19:15:25.430529 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.430506 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:15:25.437054 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.437029 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/03f34821-7441-4348-8dcf-d4abd092e54b-success-200-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"03f34821-7441-4348-8dcf-d4abd092e54b\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " Apr 24 19:15:25.437126 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.437086 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7stv\" (UniqueName: \"kubernetes.io/projected/03f34821-7441-4348-8dcf-d4abd092e54b-kube-api-access-k7stv\") pod \"03f34821-7441-4348-8dcf-d4abd092e54b\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " Apr 24 19:15:25.437168 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.437136 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls\") pod \"03f34821-7441-4348-8dcf-d4abd092e54b\" (UID: \"03f34821-7441-4348-8dcf-d4abd092e54b\") " Apr 24 19:15:25.437463 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.437417 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f34821-7441-4348-8dcf-d4abd092e54b-success-200-isvc-48e5e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-48e5e-kube-rbac-proxy-sar-config") pod "03f34821-7441-4348-8dcf-d4abd092e54b" (UID: "03f34821-7441-4348-8dcf-d4abd092e54b"). InnerVolumeSpecName "success-200-isvc-48e5e-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:15:25.439366 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.439343 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f34821-7441-4348-8dcf-d4abd092e54b-kube-api-access-k7stv" (OuterVolumeSpecName: "kube-api-access-k7stv") pod "03f34821-7441-4348-8dcf-d4abd092e54b" (UID: "03f34821-7441-4348-8dcf-d4abd092e54b"). InnerVolumeSpecName "kube-api-access-k7stv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:15:25.439481 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.439386 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "03f34821-7441-4348-8dcf-d4abd092e54b" (UID: "03f34821-7441-4348-8dcf-d4abd092e54b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:15:25.532299 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.532275 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:15:25.538271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538249 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dbvh\" (UniqueName: \"kubernetes.io/projected/457a2da5-4584-4b53-9c42-5fb9d6640940-kube-api-access-4dbvh\") pod \"457a2da5-4584-4b53-9c42-5fb9d6640940\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " Apr 24 19:15:25.538383 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538295 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/457a2da5-4584-4b53-9c42-5fb9d6640940-error-404-isvc-48e5e-kube-rbac-proxy-sar-config\") pod \"457a2da5-4584-4b53-9c42-5fb9d6640940\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " Apr 24 19:15:25.538383 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538328 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/457a2da5-4584-4b53-9c42-5fb9d6640940-proxy-tls\") pod \"457a2da5-4584-4b53-9c42-5fb9d6640940\" (UID: \"457a2da5-4584-4b53-9c42-5fb9d6640940\") " Apr 24 19:15:25.538555 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538541 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7stv\" (UniqueName: \"kubernetes.io/projected/03f34821-7441-4348-8dcf-d4abd092e54b-kube-api-access-k7stv\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:15:25.538611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538561 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03f34821-7441-4348-8dcf-d4abd092e54b-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:15:25.538611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538576 2573 
reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/03f34821-7441-4348-8dcf-d4abd092e54b-success-200-isvc-48e5e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:15:25.538676 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.538612 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/457a2da5-4584-4b53-9c42-5fb9d6640940-error-404-isvc-48e5e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-48e5e-kube-rbac-proxy-sar-config") pod "457a2da5-4584-4b53-9c42-5fb9d6640940" (UID: "457a2da5-4584-4b53-9c42-5fb9d6640940"). InnerVolumeSpecName "error-404-isvc-48e5e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:15:25.540306 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.540287 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457a2da5-4584-4b53-9c42-5fb9d6640940-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "457a2da5-4584-4b53-9c42-5fb9d6640940" (UID: "457a2da5-4584-4b53-9c42-5fb9d6640940"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:15:25.540348 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.540330 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457a2da5-4584-4b53-9c42-5fb9d6640940-kube-api-access-4dbvh" (OuterVolumeSpecName: "kube-api-access-4dbvh") pod "457a2da5-4584-4b53-9c42-5fb9d6640940" (UID: "457a2da5-4584-4b53-9c42-5fb9d6640940"). InnerVolumeSpecName "kube-api-access-4dbvh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:15:25.644098 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.643999 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dbvh\" (UniqueName: \"kubernetes.io/projected/457a2da5-4584-4b53-9c42-5fb9d6640940-kube-api-access-4dbvh\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:15:25.644098 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.644068 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-48e5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/457a2da5-4584-4b53-9c42-5fb9d6640940-error-404-isvc-48e5e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:15:25.644298 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:25.644099 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/457a2da5-4584-4b53-9c42-5fb9d6640940-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:15:26.059197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.059153 2573 generic.go:358] "Generic (PLEG): container finished" podID="03f34821-7441-4348-8dcf-d4abd092e54b" containerID="74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56" exitCode=0 Apr 24 19:15:26.059197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.059192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" event={"ID":"03f34821-7441-4348-8dcf-d4abd092e54b","Type":"ContainerDied","Data":"74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56"} Apr 24 19:15:26.059759 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.059235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" 
event={"ID":"03f34821-7441-4348-8dcf-d4abd092e54b","Type":"ContainerDied","Data":"36cecfe444ad98e36ef863ef87f8ff30a9f0a63dd5536a5ba92110198b2ec526"} Apr 24 19:15:26.059759 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.059251 2573 scope.go:117] "RemoveContainer" containerID="3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6" Apr 24 19:15:26.059759 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.059254 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx" Apr 24 19:15:26.060766 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.060738 2573 generic.go:358] "Generic (PLEG): container finished" podID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerID="71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17" exitCode=0 Apr 24 19:15:26.060874 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.060790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" event={"ID":"457a2da5-4584-4b53-9c42-5fb9d6640940","Type":"ContainerDied","Data":"71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17"} Apr 24 19:15:26.060874 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.060816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" event={"ID":"457a2da5-4584-4b53-9c42-5fb9d6640940","Type":"ContainerDied","Data":"032f44c5bc9a25f003a88060ca6b4ccb4ea07510e6ca2b71df329630c23ef78f"} Apr 24 19:15:26.060874 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.060828 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx" Apr 24 19:15:26.067808 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.067789 2573 scope.go:117] "RemoveContainer" containerID="74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56" Apr 24 19:15:26.075632 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.075615 2573 scope.go:117] "RemoveContainer" containerID="3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6" Apr 24 19:15:26.075884 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:26.075866 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6\": container with ID starting with 3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6 not found: ID does not exist" containerID="3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6" Apr 24 19:15:26.075938 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.075892 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6"} err="failed to get container status \"3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6\": rpc error: code = NotFound desc = could not find container \"3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6\": container with ID starting with 3583d472a4b9a177a0046d47332f2fb775ce263497ae3778a45bbf7fc9c5e8b6 not found: ID does not exist" Apr 24 19:15:26.075938 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.075910 2573 scope.go:117] "RemoveContainer" containerID="74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56" Apr 24 19:15:26.076098 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:26.076082 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56\": container with ID starting with 74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56 not found: ID does not exist" containerID="74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56" Apr 24 19:15:26.076156 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.076102 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56"} err="failed to get container status \"74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56\": rpc error: code = NotFound desc = could not find container \"74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56\": container with ID starting with 74fc43f63dd728726bbf1028f57ac22b519ce156e6fe5409986d9ac38712de56 not found: ID does not exist" Apr 24 19:15:26.076156 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.076115 2573 scope.go:117] "RemoveContainer" containerID="11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a" Apr 24 19:15:26.084849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.084772 2573 scope.go:117] "RemoveContainer" containerID="71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17" Apr 24 19:15:26.085963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.085940 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx"] Apr 24 19:15:26.087666 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.087648 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-48e5e-predictor-85bc6cd4f-z28gx"] Apr 24 19:15:26.092309 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.092286 2573 scope.go:117] "RemoveContainer" containerID="11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a" Apr 24 19:15:26.092638 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:26.092618 2573 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a\": container with ID starting with 11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a not found: ID does not exist" containerID="11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a" Apr 24 19:15:26.092711 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.092645 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a"} err="failed to get container status \"11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a\": rpc error: code = NotFound desc = could not find container \"11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a\": container with ID starting with 11cd71bd95834bb54d77abc10647f61b78700aba57aceb7cbc1f2ed6a56fe36a not found: ID does not exist" Apr 24 19:15:26.092711 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.092663 2573 scope.go:117] "RemoveContainer" containerID="71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17" Apr 24 19:15:26.092899 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:15:26.092883 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17\": container with ID starting with 71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17 not found: ID does not exist" containerID="71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17" Apr 24 19:15:26.092936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.092904 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17"} err="failed to get container status 
\"71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17\": rpc error: code = NotFound desc = could not find container \"71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17\": container with ID starting with 71a77d4930cdef6d608e8a05f199c1954f80e361fae5e81e90d7c10e33c96e17 not found: ID does not exist" Apr 24 19:15:26.098075 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.098056 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx"] Apr 24 19:15:26.103783 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:26.103761 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-48e5e-predictor-5fd594f674-4dgzx"] Apr 24 19:15:27.379869 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:27.379835 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" path="/var/lib/kubelet/pods/03f34821-7441-4348-8dcf-d4abd092e54b/volumes" Apr 24 19:15:27.380241 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:27.380228 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" path="/var/lib/kubelet/pods/457a2da5-4584-4b53-9c42-5fb9d6640940/volumes" Apr 24 19:15:30.059418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:30.059387 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:15:30.059983 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:30.059586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:15:30.059983 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:30.059792 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 19:15:30.060237 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:30.060212 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 19:15:40.060635 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:40.060597 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 19:15:40.060996 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:40.060601 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 19:15:50.060055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:50.060013 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 19:15:50.060423 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:15:50.060147 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" 
Apr 24 19:16:00.059826 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:00.059787 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 19:16:00.060305 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:00.060285 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 19:16:01.932706 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.932671 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"] Apr 24 19:16:01.933171 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933104 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kube-rbac-proxy" Apr 24 19:16:01.933171 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933125 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kube-rbac-proxy" Apr 24 19:16:01.933171 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933150 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" Apr 24 19:16:01.933171 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933159 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" Apr 24 19:16:01.933171 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933168 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kube-rbac-proxy" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933177 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kube-rbac-proxy" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933198 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933206 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933274 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kube-rbac-proxy" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933285 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kube-rbac-proxy" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933297 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="03f34821-7441-4348-8dcf-d4abd092e54b" containerName="kserve-container" Apr 24 19:16:01.933368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.933311 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="457a2da5-4584-4b53-9c42-5fb9d6640940" containerName="kserve-container" Apr 24 19:16:01.936907 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.936888 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:01.939288 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.939263 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-99560-predictor-serving-cert\"" Apr 24 19:16:01.939408 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.939290 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-99560-kube-rbac-proxy-sar-config\"" Apr 24 19:16:01.946563 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:01.946539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"] Apr 24 19:16:02.010491 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.010453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tppl\" (UniqueName: \"kubernetes.io/projected/b9c97214-f5d6-411a-9961-05c62b416798-kube-api-access-4tppl\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.010672 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.010511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9c97214-f5d6-411a-9961-05c62b416798-success-200-isvc-99560-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.010672 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.010604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.025085 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.025048 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"] Apr 24 19:16:02.029755 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.029725 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.032663 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.032641 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-99560-kube-rbac-proxy-sar-config\"" Apr 24 19:16:02.032857 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.032737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-99560-predictor-serving-cert\"" Apr 24 19:16:02.035947 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.035925 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"] Apr 24 19:16:02.111098 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tppl\" (UniqueName: \"kubernetes.io/projected/b9c97214-f5d6-411a-9961-05c62b416798-kube-api-access-4tppl\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.111271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111107 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9v6\" (UniqueName: \"kubernetes.io/projected/60633c68-5404-4495-8e49-01e91e0f62ba-kube-api-access-2w9v6\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.111271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9c97214-f5d6-411a-9961-05c62b416798-success-200-isvc-99560-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.111271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/60633c68-5404-4495-8e49-01e91e0f62ba-error-404-isvc-99560-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.111271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.111535 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111276 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.111535 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:16:02.111386 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-99560-predictor-serving-cert: secret "success-200-isvc-99560-predictor-serving-cert" not found Apr 24 19:16:02.111535 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:16:02.111479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls podName:b9c97214-f5d6-411a-9961-05c62b416798 nodeName:}" failed. No retries permitted until 2026-04-24 19:16:02.611457497 +0000 UTC m=+599.728797879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls") pod "success-200-isvc-99560-predictor-6f5d77689c-g76x6" (UID: "b9c97214-f5d6-411a-9961-05c62b416798") : secret "success-200-isvc-99560-predictor-serving-cert" not found Apr 24 19:16:02.111839 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.111821 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9c97214-f5d6-411a-9961-05c62b416798-success-200-isvc-99560-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.119974 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.119953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4tppl\" (UniqueName: \"kubernetes.io/projected/b9c97214-f5d6-411a-9961-05c62b416798-kube-api-access-4tppl\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.212664 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.212579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/60633c68-5404-4495-8e49-01e91e0f62ba-error-404-isvc-99560-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.212664 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.212635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.212867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.212683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9v6\" (UniqueName: \"kubernetes.io/projected/60633c68-5404-4495-8e49-01e91e0f62ba-kube-api-access-2w9v6\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.212867 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:16:02.212816 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-99560-predictor-serving-cert: secret "error-404-isvc-99560-predictor-serving-cert" not found Apr 24 
19:16:02.212933 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:16:02.212894 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls podName:60633c68-5404-4495-8e49-01e91e0f62ba nodeName:}" failed. No retries permitted until 2026-04-24 19:16:02.712874327 +0000 UTC m=+599.830214693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls") pod "error-404-isvc-99560-predictor-59865b4b78-f7h57" (UID: "60633c68-5404-4495-8e49-01e91e0f62ba") : secret "error-404-isvc-99560-predictor-serving-cert" not found Apr 24 19:16:02.213237 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.213217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/60633c68-5404-4495-8e49-01e91e0f62ba-error-404-isvc-99560-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.221928 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.221902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9v6\" (UniqueName: \"kubernetes.io/projected/60633c68-5404-4495-8e49-01e91e0f62ba-kube-api-access-2w9v6\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.616940 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.616890 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: 
\"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.619420 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.619389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls\") pod \"success-200-isvc-99560-predictor-6f5d77689c-g76x6\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") " pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.718274 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.718240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.720768 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.720739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls\") pod \"error-404-isvc-99560-predictor-59865b4b78-f7h57\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") " pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.848594 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.848560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:02.942990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.942960 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:02.972660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:02.972530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"] Apr 24 19:16:02.975415 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:16:02.975384 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9c97214_f5d6_411a_9961_05c62b416798.slice/crio-338777ab90c16fdb3ed2ab4a24e662f30d60d24fc9777d5176c0aaeaddc10556 WatchSource:0}: Error finding container 338777ab90c16fdb3ed2ab4a24e662f30d60d24fc9777d5176c0aaeaddc10556: Status 404 returned error can't find the container with id 338777ab90c16fdb3ed2ab4a24e662f30d60d24fc9777d5176c0aaeaddc10556 Apr 24 19:16:03.072709 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.072685 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"] Apr 24 19:16:03.079609 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:16:03.079582 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60633c68_5404_4495_8e49_01e91e0f62ba.slice/crio-65c07906e9a0cba0956f1c5985fc3db73188bdec40a78fc34a4b3f54564c66a6 WatchSource:0}: Error finding container 65c07906e9a0cba0956f1c5985fc3db73188bdec40a78fc34a4b3f54564c66a6: Status 404 returned error can't find the container with id 65c07906e9a0cba0956f1c5985fc3db73188bdec40a78fc34a4b3f54564c66a6 Apr 24 19:16:03.171580 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.171510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" event={"ID":"60633c68-5404-4495-8e49-01e91e0f62ba","Type":"ContainerStarted","Data":"fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601"} Apr 24 
19:16:03.171580 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.171556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" event={"ID":"60633c68-5404-4495-8e49-01e91e0f62ba","Type":"ContainerStarted","Data":"65c07906e9a0cba0956f1c5985fc3db73188bdec40a78fc34a4b3f54564c66a6"} Apr 24 19:16:03.173636 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.173608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" event={"ID":"b9c97214-f5d6-411a-9961-05c62b416798","Type":"ContainerStarted","Data":"56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1"} Apr 24 19:16:03.173758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.173642 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" event={"ID":"b9c97214-f5d6-411a-9961-05c62b416798","Type":"ContainerStarted","Data":"70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7"} Apr 24 19:16:03.173758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.173657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" event={"ID":"b9c97214-f5d6-411a-9961-05c62b416798","Type":"ContainerStarted","Data":"338777ab90c16fdb3ed2ab4a24e662f30d60d24fc9777d5176c0aaeaddc10556"} Apr 24 19:16:03.173758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.173736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:03.195581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:03.195533 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podStartSLOduration=2.195519649 podStartE2EDuration="2.195519649s" podCreationTimestamp="2026-04-24 
19:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:16:03.193185872 +0000 UTC m=+600.310526264" watchObservedRunningTime="2026-04-24 19:16:03.195519649 +0000 UTC m=+600.312860036" Apr 24 19:16:04.177574 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:04.177536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" event={"ID":"60633c68-5404-4495-8e49-01e91e0f62ba","Type":"ContainerStarted","Data":"e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37"} Apr 24 19:16:04.178000 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:04.177781 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:04.178000 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:04.177810 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:04.179106 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:04.179080 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 19:16:04.197289 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:04.197242 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podStartSLOduration=2.197229617 podStartE2EDuration="2.197229617s" podCreationTimestamp="2026-04-24 19:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:16:04.196053022 +0000 UTC 
m=+601.313393410" watchObservedRunningTime="2026-04-24 19:16:04.197229617 +0000 UTC m=+601.314570005" Apr 24 19:16:05.181011 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:05.180970 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:05.181401 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:05.181055 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 19:16:05.182274 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:05.182250 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 19:16:06.184176 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:06.184144 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 19:16:10.060341 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:10.060314 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" Apr 24 19:16:10.061076 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:10.061057 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:16:10.185577 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:10.185549 
2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:10.186109 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:10.186085 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 19:16:11.188466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:11.188421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:16:11.188852 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:11.188822 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 19:16:20.186607 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:20.186517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 19:16:21.188894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:21.188854 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 19:16:30.186078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:30.186035 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 19:16:31.189108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:31.189071 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 19:16:40.186906 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:40.186818 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 19:16:41.189467 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:41.189412 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 19:16:50.186614 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:50.186582 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" Apr 24 19:16:51.190192 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:16:51.190157 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" Apr 24 19:24:36.875388 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.875354 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"]
Apr 24 19:24:36.877861 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.875739 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" containerID="cri-o://dcbe55b26169184d87746f52a9b58e940f8ddf1905a0e453167bdb19cdc06416" gracePeriod=30
Apr 24 19:24:36.877861 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.875827 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kube-rbac-proxy" containerID="cri-o://ea407126ed9854bba7dbd2aef0e5ec6e4daf4efacedf58ef269a6e0e6f752e9f" gracePeriod=30
Apr 24 19:24:36.928120 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.928090 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"]
Apr 24 19:24:36.928498 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.928463 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" containerID="cri-o://f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf" gracePeriod=30
Apr 24 19:24:36.928498 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.928483 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kube-rbac-proxy" containerID="cri-o://80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2" gracePeriod=30
Apr 24 19:24:36.944900 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.944875 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"]
Apr 24 19:24:36.948253 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.948232 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:36.950574 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.950552 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b2e49-predictor-serving-cert\""
Apr 24 19:24:36.950685 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.950594 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b2e49-kube-rbac-proxy-sar-config\""
Apr 24 19:24:36.969595 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:36.969573 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"]
Apr 24 19:24:37.000078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.000045 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"]
Apr 24 19:24:37.003084 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.003065 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.005362 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.005343 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b2e49-predictor-serving-cert\""
Apr 24 19:24:37.005483 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.005388 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b2e49-kube-rbac-proxy-sar-config\""
Apr 24 19:24:37.013680 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.013660 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"]
Apr 24 19:24:37.061954 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.061927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ddbb448-03b5-46b3-9277-eae73d6103ba-error-404-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.062091 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.061965 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkfb\" (UniqueName: \"kubernetes.io/projected/0ddbb448-03b5-46b3-9277-eae73d6103ba-kube-api-access-jhkfb\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.062091 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.061998 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lq2x\" (UniqueName: \"kubernetes.io/projected/6a243971-45ac-410a-9e5d-ef4761931a44-kube-api-access-9lq2x\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.062091 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.062077 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ddbb448-03b5-46b3-9277-eae73d6103ba-proxy-tls\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.062210 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.062108 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a243971-45ac-410a-9e5d-ef4761931a44-success-200-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.062210 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.062154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a243971-45ac-410a-9e5d-ef4761931a44-proxy-tls\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.163232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.163155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkfb\" (UniqueName: \"kubernetes.io/projected/0ddbb448-03b5-46b3-9277-eae73d6103ba-kube-api-access-jhkfb\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.163232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.163202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lq2x\" (UniqueName: \"kubernetes.io/projected/6a243971-45ac-410a-9e5d-ef4761931a44-kube-api-access-9lq2x\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.163466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.163342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ddbb448-03b5-46b3-9277-eae73d6103ba-proxy-tls\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.163466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.163395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a243971-45ac-410a-9e5d-ef4761931a44-success-200-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.163582 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.163492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a243971-45ac-410a-9e5d-ef4761931a44-proxy-tls\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.163582 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.163548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ddbb448-03b5-46b3-9277-eae73d6103ba-error-404-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.164194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.164125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a243971-45ac-410a-9e5d-ef4761931a44-success-200-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.164194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.164132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ddbb448-03b5-46b3-9277-eae73d6103ba-error-404-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.166148 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.166119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a243971-45ac-410a-9e5d-ef4761931a44-proxy-tls\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.166272 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.166202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ddbb448-03b5-46b3-9277-eae73d6103ba-proxy-tls\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.171578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.171558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lq2x\" (UniqueName: \"kubernetes.io/projected/6a243971-45ac-410a-9e5d-ef4761931a44-kube-api-access-9lq2x\") pod \"success-200-isvc-b2e49-predictor-869fccf77f-86ffs\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.171926 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.171903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkfb\" (UniqueName: \"kubernetes.io/projected/0ddbb448-03b5-46b3-9277-eae73d6103ba-kube-api-access-jhkfb\") pod \"error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.261924 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.261899 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.313518 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.313484 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.388505 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.388480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"]
Apr 24 19:24:37.390033 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:24:37.390003 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a243971_45ac_410a_9e5d_ef4761931a44.slice/crio-194c24ffdff86cf109256d49946a33379c1aa01eaa6d675197817b86bdfae4a1 WatchSource:0}: Error finding container 194c24ffdff86cf109256d49946a33379c1aa01eaa6d675197817b86bdfae4a1: Status 404 returned error can't find the container with id 194c24ffdff86cf109256d49946a33379c1aa01eaa6d675197817b86bdfae4a1
Apr 24 19:24:37.391904 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.391884 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:24:37.452030 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.452007 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"]
Apr 24 19:24:37.454582 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:24:37.454555 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ddbb448_03b5_46b3_9277_eae73d6103ba.slice/crio-b71ba310eacbddfdf1c82866e2ba412a2319817c5de1d80689e6f07ebdff44b2 WatchSource:0}: Error finding container b71ba310eacbddfdf1c82866e2ba412a2319817c5de1d80689e6f07ebdff44b2: Status 404 returned error can't find the container with id b71ba310eacbddfdf1c82866e2ba412a2319817c5de1d80689e6f07ebdff44b2
Apr 24 19:24:37.634045 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.634013 2573 generic.go:358] "Generic (PLEG): container finished" podID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerID="80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2" exitCode=2
Apr 24 19:24:37.634206 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.634076 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" event={"ID":"3a8cdfef-d943-4b8b-8641-ef57122233aa","Type":"ContainerDied","Data":"80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2"}
Apr 24 19:24:37.635568 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.635543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" event={"ID":"0ddbb448-03b5-46b3-9277-eae73d6103ba","Type":"ContainerStarted","Data":"4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b"}
Apr 24 19:24:37.635729 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.635573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" event={"ID":"0ddbb448-03b5-46b3-9277-eae73d6103ba","Type":"ContainerStarted","Data":"fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2"}
Apr 24 19:24:37.635729 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.635584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" event={"ID":"0ddbb448-03b5-46b3-9277-eae73d6103ba","Type":"ContainerStarted","Data":"b71ba310eacbddfdf1c82866e2ba412a2319817c5de1d80689e6f07ebdff44b2"}
Apr 24 19:24:37.635729 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.635599 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.635729 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.635661 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:24:37.636707 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.636682 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 19:24:37.637116 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.637097 2573 generic.go:358] "Generic (PLEG): container finished" podID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerID="ea407126ed9854bba7dbd2aef0e5ec6e4daf4efacedf58ef269a6e0e6f752e9f" exitCode=2
Apr 24 19:24:37.637194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.637119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" event={"ID":"b2a3743c-17ad-490b-b19b-1434fb28d669","Type":"ContainerDied","Data":"ea407126ed9854bba7dbd2aef0e5ec6e4daf4efacedf58ef269a6e0e6f752e9f"}
Apr 24 19:24:37.638603 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.638582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" event={"ID":"6a243971-45ac-410a-9e5d-ef4761931a44","Type":"ContainerStarted","Data":"4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec"}
Apr 24 19:24:37.638696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.638609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" event={"ID":"6a243971-45ac-410a-9e5d-ef4761931a44","Type":"ContainerStarted","Data":"db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9"}
Apr 24 19:24:37.638696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.638622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" event={"ID":"6a243971-45ac-410a-9e5d-ef4761931a44","Type":"ContainerStarted","Data":"194c24ffdff86cf109256d49946a33379c1aa01eaa6d675197817b86bdfae4a1"}
Apr 24 19:24:37.638810 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.638752 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:37.655552 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.655514 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podStartSLOduration=1.655503561 podStartE2EDuration="1.655503561s" podCreationTimestamp="2026-04-24 19:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:24:37.65296276 +0000 UTC m=+1114.770303162" watchObservedRunningTime="2026-04-24 19:24:37.655503561 +0000 UTC m=+1114.772843949"
Apr 24 19:24:37.672100 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:37.672046 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podStartSLOduration=1.672029583 podStartE2EDuration="1.672029583s" podCreationTimestamp="2026-04-24 19:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:24:37.67009562 +0000 UTC m=+1114.787436021" watchObservedRunningTime="2026-04-24 19:24:37.672029583 +0000 UTC m=+1114.789369973"
Apr 24 19:24:38.641852 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:38.641813 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:24:38.642327 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:38.642062 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 19:24:38.643232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:38.643209 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 19:24:39.645763 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:39.645727 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 24 19:24:40.055007 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.054962 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused"
Apr 24 19:24:40.055195 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.054962 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused"
Apr 24 19:24:40.060415 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.060388 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 24 19:24:40.060566 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.060396 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 24 19:24:40.183081 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.183056 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"
Apr 24 19:24:40.292522 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.292486 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls\") pod \"3a8cdfef-d943-4b8b-8641-ef57122233aa\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") "
Apr 24 19:24:40.292678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.292531 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a8cdfef-d943-4b8b-8641-ef57122233aa-error-404-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"3a8cdfef-d943-4b8b-8641-ef57122233aa\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") "
Apr 24 19:24:40.292678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.292558 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxjvc\" (UniqueName: \"kubernetes.io/projected/3a8cdfef-d943-4b8b-8641-ef57122233aa-kube-api-access-mxjvc\") pod \"3a8cdfef-d943-4b8b-8641-ef57122233aa\" (UID: \"3a8cdfef-d943-4b8b-8641-ef57122233aa\") "
Apr 24 19:24:40.292916 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.292888 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8cdfef-d943-4b8b-8641-ef57122233aa-error-404-isvc-d5820-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d5820-kube-rbac-proxy-sar-config") pod "3a8cdfef-d943-4b8b-8641-ef57122233aa" (UID: "3a8cdfef-d943-4b8b-8641-ef57122233aa"). InnerVolumeSpecName "error-404-isvc-d5820-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:24:40.294813 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.294788 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a8cdfef-d943-4b8b-8641-ef57122233aa-kube-api-access-mxjvc" (OuterVolumeSpecName: "kube-api-access-mxjvc") pod "3a8cdfef-d943-4b8b-8641-ef57122233aa" (UID: "3a8cdfef-d943-4b8b-8641-ef57122233aa"). InnerVolumeSpecName "kube-api-access-mxjvc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:24:40.294889 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.294803 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3a8cdfef-d943-4b8b-8641-ef57122233aa" (UID: "3a8cdfef-d943-4b8b-8641-ef57122233aa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:24:40.394073 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.394037 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a8cdfef-d943-4b8b-8641-ef57122233aa-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:24:40.394073 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.394069 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a8cdfef-d943-4b8b-8641-ef57122233aa-error-404-isvc-d5820-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:24:40.394073 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.394081 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxjvc\" (UniqueName: \"kubernetes.io/projected/3a8cdfef-d943-4b8b-8641-ef57122233aa-kube-api-access-mxjvc\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:24:40.650413 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.650384 2573 generic.go:358] "Generic (PLEG): container finished" podID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerID="dcbe55b26169184d87746f52a9b58e940f8ddf1905a0e453167bdb19cdc06416" exitCode=0
Apr 24 19:24:40.650825 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.650539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" event={"ID":"b2a3743c-17ad-490b-b19b-1434fb28d669","Type":"ContainerDied","Data":"dcbe55b26169184d87746f52a9b58e940f8ddf1905a0e453167bdb19cdc06416"}
Apr 24 19:24:40.652048 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.652017 2573 generic.go:358] "Generic (PLEG): container finished" podID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerID="f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf" exitCode=0
Apr 24 19:24:40.652166 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.652059 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" event={"ID":"3a8cdfef-d943-4b8b-8641-ef57122233aa","Type":"ContainerDied","Data":"f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf"}
Apr 24 19:24:40.652166 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.652088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx" event={"ID":"3a8cdfef-d943-4b8b-8641-ef57122233aa","Type":"ContainerDied","Data":"4a673b1c612ca6f83416462eb1832519cf777ff953748dab1ab9d0f0929f9562"}
Apr 24 19:24:40.652166 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.652097 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"
Apr 24 19:24:40.652166 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.652108 2573 scope.go:117] "RemoveContainer" containerID="80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2"
Apr 24 19:24:40.661049 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.661028 2573 scope.go:117] "RemoveContainer" containerID="f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf"
Apr 24 19:24:40.670048 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.670014 2573 scope.go:117] "RemoveContainer" containerID="80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2"
Apr 24 19:24:40.670315 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:24:40.670293 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2\": container with ID starting with 80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2 not found: ID does not exist" containerID="80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2"
Apr 24 19:24:40.670400 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.670325 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2"} err="failed to get container status \"80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2\": rpc error: code = NotFound desc = could not find container \"80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2\": container with ID starting with 80f5a50efd1ec708e45f1e8d45a99295f5f168f37126696b41f2aea03e6e66c2 not found: ID does not exist"
Apr 24 19:24:40.670400 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.670343 2573 scope.go:117] "RemoveContainer" containerID="f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf"
Apr 24 19:24:40.670656 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:24:40.670632 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf\": container with ID starting with f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf not found: ID does not exist" containerID="f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf"
Apr 24 19:24:40.670722 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.670671 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf"} err="failed to get container status \"f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf\": rpc error: code = NotFound desc = could not find container \"f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf\": container with ID starting with f3e2694159c02b9e84fc6bc65a1b2446112581f3e7cdfbedf3ea780754afcaaf not found: ID does not exist"
Apr 24 19:24:40.674419 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.674395 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"]
Apr 24 19:24:40.678003 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.677971 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d5820-predictor-5b7d74d57d-j44tx"]
Apr 24 19:24:40.704295 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.704277 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"
Apr 24 19:24:40.797936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.797902 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls\") pod \"b2a3743c-17ad-490b-b19b-1434fb28d669\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") "
Apr 24 19:24:40.798116 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.797962 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6z6d\" (UniqueName: \"kubernetes.io/projected/b2a3743c-17ad-490b-b19b-1434fb28d669-kube-api-access-c6z6d\") pod \"b2a3743c-17ad-490b-b19b-1434fb28d669\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") "
Apr 24 19:24:40.798116 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.797989 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b2a3743c-17ad-490b-b19b-1434fb28d669-success-200-isvc-d5820-kube-rbac-proxy-sar-config\") pod \"b2a3743c-17ad-490b-b19b-1434fb28d669\" (UID: \"b2a3743c-17ad-490b-b19b-1434fb28d669\") "
Apr 24 19:24:40.798370 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.798349 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a3743c-17ad-490b-b19b-1434fb28d669-success-200-isvc-d5820-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d5820-kube-rbac-proxy-sar-config") pod "b2a3743c-17ad-490b-b19b-1434fb28d669" (UID: "b2a3743c-17ad-490b-b19b-1434fb28d669"). InnerVolumeSpecName "success-200-isvc-d5820-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:24:40.800150 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.800126 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b2a3743c-17ad-490b-b19b-1434fb28d669" (UID: "b2a3743c-17ad-490b-b19b-1434fb28d669"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:24:40.800249 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.800160 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a3743c-17ad-490b-b19b-1434fb28d669-kube-api-access-c6z6d" (OuterVolumeSpecName: "kube-api-access-c6z6d") pod "b2a3743c-17ad-490b-b19b-1434fb28d669" (UID: "b2a3743c-17ad-490b-b19b-1434fb28d669"). InnerVolumeSpecName "kube-api-access-c6z6d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:24:40.899402 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.899321 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2a3743c-17ad-490b-b19b-1434fb28d669-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:24:40.899402 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.899351 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6z6d\" (UniqueName: \"kubernetes.io/projected/b2a3743c-17ad-490b-b19b-1434fb28d669-kube-api-access-c6z6d\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:24:40.899402 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:40.899361 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d5820-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b2a3743c-17ad-490b-b19b-1434fb28d669-success-200-isvc-d5820-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:24:41.380072 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.380036 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" path="/var/lib/kubelet/pods/3a8cdfef-d943-4b8b-8641-ef57122233aa/volumes"
Apr 24 19:24:41.656898 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.656814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" event={"ID":"b2a3743c-17ad-490b-b19b-1434fb28d669","Type":"ContainerDied","Data":"70c092b38764eaab2f08bd16e514192f921b73f336e79b7d808ee52a8952a916"}
Apr 24 19:24:41.656898 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.656872 2573 scope.go:117] "RemoveContainer" containerID="ea407126ed9854bba7dbd2aef0e5ec6e4daf4efacedf58ef269a6e0e6f752e9f"
Apr 24 19:24:41.656898 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.656882 2573 util.go:48] "No ready sandbox for pod can
be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv" Apr 24 19:24:41.666007 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.665988 2573 scope.go:117] "RemoveContainer" containerID="dcbe55b26169184d87746f52a9b58e940f8ddf1905a0e453167bdb19cdc06416" Apr 24 19:24:41.674792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.674771 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"] Apr 24 19:24:41.679093 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:41.679062 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d5820-predictor-5c958b5f7c-995rv"] Apr 24 19:24:43.379312 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:43.379277 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" path="/var/lib/kubelet/pods/b2a3743c-17ad-490b-b19b-1434fb28d669/volumes" Apr 24 19:24:43.646632 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:43.646555 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" Apr 24 19:24:43.647089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:43.647066 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 19:24:44.650644 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:44.650618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" Apr 24 19:24:44.651166 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:44.651141 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 19:24:53.647276 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:53.647238 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 19:24:54.651372 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:24:54.651334 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 19:25:03.647943 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:03.647858 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 19:25:04.651853 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:04.651810 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 19:25:13.647689 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:13.647652 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 19:25:14.651766 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:14.651728 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 19:25:16.731660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.731630 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"] Apr 24 19:25:16.732108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.731986 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container" containerID="cri-o://fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601" gracePeriod=30 Apr 24 19:25:16.732177 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.732134 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kube-rbac-proxy" containerID="cri-o://e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37" gracePeriod=30 Apr 24 19:25:16.775511 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.775478 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"] Apr 24 19:25:16.775794 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.775756 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" 
podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" containerID="cri-o://70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7" gracePeriod=30 Apr 24 19:25:16.775869 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.775790 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kube-rbac-proxy" containerID="cri-o://56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1" gracePeriod=30 Apr 24 19:25:16.813630 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813600 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"] Apr 24 19:25:16.813950 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813935 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kube-rbac-proxy" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813953 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kube-rbac-proxy" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813963 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813968 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813977 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kube-rbac-proxy" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:25:16.813982 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kube-rbac-proxy" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813991 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" Apr 24 19:25:16.813997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.813995 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" Apr 24 19:25:16.814191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.814041 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kube-rbac-proxy" Apr 24 19:25:16.814191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.814049 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2a3743c-17ad-490b-b19b-1434fb28d669" containerName="kserve-container" Apr 24 19:25:16.814191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.814056 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kserve-container" Apr 24 19:25:16.814191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.814062 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a8cdfef-d943-4b8b-8641-ef57122233aa" containerName="kube-rbac-proxy" Apr 24 19:25:16.818558 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.818543 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:16.820642 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.820620 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-443ff-predictor-serving-cert\"" Apr 24 19:25:16.820735 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.820678 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-443ff-kube-rbac-proxy-sar-config\"" Apr 24 19:25:16.823678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.823659 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"] Apr 24 19:25:16.892727 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.892702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"] Apr 24 19:25:16.895933 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.895916 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:16.898373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.898351 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-443ff-predictor-serving-cert\"" Apr 24 19:25:16.898500 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.898379 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-443ff-kube-rbac-proxy-sar-config\"" Apr 24 19:25:16.901746 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.901721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-error-404-isvc-443ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:16.901841 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.901772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzsd2\" (UniqueName: \"kubernetes.io/projected/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-kube-api-access-kzsd2\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:16.901897 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.901840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cde8196-8e50-4150-a320-a610187ae842-success-200-isvc-443ff-kube-rbac-proxy-sar-config\") pod 
\"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:16.901897 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.901878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbrh\" (UniqueName: \"kubernetes.io/projected/4cde8196-8e50-4150-a320-a610187ae842-kube-api-access-mnbrh\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:16.901990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.901940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cde8196-8e50-4150-a320-a610187ae842-proxy-tls\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:16.902044 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.901990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-proxy-tls\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:16.907593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:16.907574 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"] Apr 24 19:25:17.002418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.002392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-proxy-tls\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.002628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.002452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-error-404-isvc-443ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.002628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.002486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzsd2\" (UniqueName: \"kubernetes.io/projected/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-kube-api-access-kzsd2\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.002628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.002509 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cde8196-8e50-4150-a320-a610187ae842-success-200-isvc-443ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.002628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.002526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbrh\" (UniqueName: 
\"kubernetes.io/projected/4cde8196-8e50-4150-a320-a610187ae842-kube-api-access-mnbrh\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.002628 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.002571 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cde8196-8e50-4150-a320-a610187ae842-proxy-tls\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.003211 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.003171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cde8196-8e50-4150-a320-a610187ae842-success-200-isvc-443ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.003321 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.003208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-error-404-isvc-443ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.005012 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.004995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-proxy-tls\") pod 
\"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.005060 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.005036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cde8196-8e50-4150-a320-a610187ae842-proxy-tls\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.011688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.011669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbrh\" (UniqueName: \"kubernetes.io/projected/4cde8196-8e50-4150-a320-a610187ae842-kube-api-access-mnbrh\") pod \"success-200-isvc-443ff-predictor-f8c98b6df-98lqm\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.011792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.011668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzsd2\" (UniqueName: \"kubernetes.io/projected/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-kube-api-access-kzsd2\") pod \"error-404-isvc-443ff-predictor-6f4fd75694-x5jf4\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.129576 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.129533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.207104 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.206293 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.256124 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.255327 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"] Apr 24 19:25:17.258457 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:25:17.258404 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cde8196_8e50_4150_a320_a610187ae842.slice/crio-5cb2d45bd4333f2ab1200baff880ad04c05e4e9c4fa4c4621163b2d77db8617c WatchSource:0}: Error finding container 5cb2d45bd4333f2ab1200baff880ad04c05e4e9c4fa4c4621163b2d77db8617c: Status 404 returned error can't find the container with id 5cb2d45bd4333f2ab1200baff880ad04c05e4e9c4fa4c4621163b2d77db8617c Apr 24 19:25:17.336403 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.336382 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"] Apr 24 19:25:17.338363 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:25:17.338337 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba2074e_1a62_4c76_bbe3_4f68b1110ff0.slice/crio-85bcd4ed197b2ce74bdfcc1f5d2196925e12716d08a721f1f79504a1c5a8b1ee WatchSource:0}: Error finding container 85bcd4ed197b2ce74bdfcc1f5d2196925e12716d08a721f1f79504a1c5a8b1ee: Status 404 returned error can't find the container with id 85bcd4ed197b2ce74bdfcc1f5d2196925e12716d08a721f1f79504a1c5a8b1ee Apr 24 19:25:17.770059 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.770022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" event={"ID":"cba2074e-1a62-4c76-bbe3-4f68b1110ff0","Type":"ContainerStarted","Data":"cbab6697a273aae82d007e6148d70b329e08290500cf7fcb073bdef7ce1775f8"} Apr 24 
19:25:17.770059 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.770067 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" event={"ID":"cba2074e-1a62-4c76-bbe3-4f68b1110ff0","Type":"ContainerStarted","Data":"d2f68f5a7c9c863ba4538b696952b522a8fe16251c31b2d35d717017c942205b"} Apr 24 19:25:17.770625 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.770079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" event={"ID":"cba2074e-1a62-4c76-bbe3-4f68b1110ff0","Type":"ContainerStarted","Data":"85bcd4ed197b2ce74bdfcc1f5d2196925e12716d08a721f1f79504a1c5a8b1ee"} Apr 24 19:25:17.772480 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.771083 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.772480 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.771114 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:25:17.772660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.772618 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 19:25:17.772792 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.772767 2573 generic.go:358] "Generic (PLEG): container finished" podID="60633c68-5404-4495-8e49-01e91e0f62ba" containerID="e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37" exitCode=2 Apr 24 19:25:17.772896 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.772838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" event={"ID":"60633c68-5404-4495-8e49-01e91e0f62ba","Type":"ContainerDied","Data":"e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37"} Apr 24 19:25:17.774808 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.774788 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9c97214-f5d6-411a-9961-05c62b416798" containerID="56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1" exitCode=2 Apr 24 19:25:17.774902 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.774871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" event={"ID":"b9c97214-f5d6-411a-9961-05c62b416798","Type":"ContainerDied","Data":"56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1"} Apr 24 19:25:17.776341 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.776323 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" event={"ID":"4cde8196-8e50-4150-a320-a610187ae842","Type":"ContainerStarted","Data":"8372952d92d88a457ec447675e6c0d75eb6f2d2563b6addfceed465d89c861c2"} Apr 24 19:25:17.776402 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.776347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" event={"ID":"4cde8196-8e50-4150-a320-a610187ae842","Type":"ContainerStarted","Data":"a0b579adaedbe136a7a03658f6cd1df63aab75d7871610b3a7558ea1f607574d"} Apr 24 19:25:17.776402 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.776357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" event={"ID":"4cde8196-8e50-4150-a320-a610187ae842","Type":"ContainerStarted","Data":"5cb2d45bd4333f2ab1200baff880ad04c05e4e9c4fa4c4621163b2d77db8617c"} Apr 24 19:25:17.776540 ip-10-0-138-6 kubenswrapper[2573]: 
I0424 19:25:17.776527 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:25:17.787632 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.787596 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podStartSLOduration=1.787582947 podStartE2EDuration="1.787582947s" podCreationTimestamp="2026-04-24 19:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:25:17.78672295 +0000 UTC m=+1154.904063347" watchObservedRunningTime="2026-04-24 19:25:17.787582947 +0000 UTC m=+1154.904923335" Apr 24 19:25:17.804102 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:17.804031 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podStartSLOduration=1.8040195890000001 podStartE2EDuration="1.804019589s" podCreationTimestamp="2026-04-24 19:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:25:17.802342576 +0000 UTC m=+1154.919682965" watchObservedRunningTime="2026-04-24 19:25:17.804019589 +0000 UTC m=+1154.921359973" Apr 24 19:25:18.779668 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:18.779633 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 19:25:18.780069 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:18.779724 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"
Apr 24 19:25:18.780806 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:18.780784 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 19:25:19.782553 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:19.782517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 19:25:19.782921 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:19.782517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 19:25:20.181493 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.181384 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused"
Apr 24 19:25:20.186765 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.186743 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 19:25:20.652143 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.652122 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"
Apr 24 19:25:20.733078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.732889 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9c97214-f5d6-411a-9961-05c62b416798-success-200-isvc-99560-kube-rbac-proxy-sar-config\") pod \"b9c97214-f5d6-411a-9961-05c62b416798\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") "
Apr 24 19:25:20.733078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.732940 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tppl\" (UniqueName: \"kubernetes.io/projected/b9c97214-f5d6-411a-9961-05c62b416798-kube-api-access-4tppl\") pod \"b9c97214-f5d6-411a-9961-05c62b416798\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") "
Apr 24 19:25:20.733078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.733000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls\") pod \"b9c97214-f5d6-411a-9961-05c62b416798\" (UID: \"b9c97214-f5d6-411a-9961-05c62b416798\") "
Apr 24 19:25:20.733346 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.733235 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c97214-f5d6-411a-9961-05c62b416798-success-200-isvc-99560-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-99560-kube-rbac-proxy-sar-config") pod "b9c97214-f5d6-411a-9961-05c62b416798" (UID: "b9c97214-f5d6-411a-9961-05c62b416798"). InnerVolumeSpecName "success-200-isvc-99560-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:25:20.735168 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.735142 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b9c97214-f5d6-411a-9961-05c62b416798" (UID: "b9c97214-f5d6-411a-9961-05c62b416798"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:25:20.735303 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.735171 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c97214-f5d6-411a-9961-05c62b416798-kube-api-access-4tppl" (OuterVolumeSpecName: "kube-api-access-4tppl") pod "b9c97214-f5d6-411a-9961-05c62b416798" (UID: "b9c97214-f5d6-411a-9961-05c62b416798"). InnerVolumeSpecName "kube-api-access-4tppl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:25:20.775821 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.775798 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"
Apr 24 19:25:20.786340 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.786309 2573 generic.go:358] "Generic (PLEG): container finished" podID="60633c68-5404-4495-8e49-01e91e0f62ba" containerID="fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601" exitCode=0
Apr 24 19:25:20.786730 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.786370 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"
Apr 24 19:25:20.786730 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.786393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" event={"ID":"60633c68-5404-4495-8e49-01e91e0f62ba","Type":"ContainerDied","Data":"fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601"}
Apr 24 19:25:20.786730 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.786468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57" event={"ID":"60633c68-5404-4495-8e49-01e91e0f62ba","Type":"ContainerDied","Data":"65c07906e9a0cba0956f1c5985fc3db73188bdec40a78fc34a4b3f54564c66a6"}
Apr 24 19:25:20.786730 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.786494 2573 scope.go:117] "RemoveContainer" containerID="e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37"
Apr 24 19:25:20.787812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.787788 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9c97214-f5d6-411a-9961-05c62b416798" containerID="70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7" exitCode=0
Apr 24 19:25:20.787889 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.787870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" event={"ID":"b9c97214-f5d6-411a-9961-05c62b416798","Type":"ContainerDied","Data":"70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7"}
Apr 24 19:25:20.787938 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.787889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6" event={"ID":"b9c97214-f5d6-411a-9961-05c62b416798","Type":"ContainerDied","Data":"338777ab90c16fdb3ed2ab4a24e662f30d60d24fc9777d5176c0aaeaddc10556"}
Apr 24 19:25:20.788082 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.788065 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"
Apr 24 19:25:20.796013 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.795991 2573 scope.go:117] "RemoveContainer" containerID="fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601"
Apr 24 19:25:20.804095 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.804082 2573 scope.go:117] "RemoveContainer" containerID="e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37"
Apr 24 19:25:20.804314 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:20.804299 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37\": container with ID starting with e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37 not found: ID does not exist" containerID="e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37"
Apr 24 19:25:20.804368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.804321 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37"} err="failed to get container status \"e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37\": rpc error: code = NotFound desc = could not find container \"e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37\": container with ID starting with e16346909356b708963277905ffc1ded30ae0692e9f56b2b7e2f1e76ec94dc37 not found: ID does not exist"
Apr 24 19:25:20.804368 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.804335 2573 scope.go:117] "RemoveContainer" containerID="fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601"
Apr 24 19:25:20.804593 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:20.804576 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601\": container with ID starting with fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601 not found: ID does not exist" containerID="fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601"
Apr 24 19:25:20.804635 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.804600 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601"} err="failed to get container status \"fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601\": rpc error: code = NotFound desc = could not find container \"fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601\": container with ID starting with fab901caed66c5969d98d94879a2f893db3faa9fe856e973a61511bc6f08c601 not found: ID does not exist"
Apr 24 19:25:20.804635 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.804616 2573 scope.go:117] "RemoveContainer" containerID="56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1"
Apr 24 19:25:20.812110 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.812086 2573 scope.go:117] "RemoveContainer" containerID="70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7"
Apr 24 19:25:20.812301 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.812285 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"]
Apr 24 19:25:20.814739 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.814717 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99560-predictor-6f5d77689c-g76x6"]
Apr 24 19:25:20.819225 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.819208 2573 scope.go:117] "RemoveContainer" containerID="56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1"
Apr 24 19:25:20.819494 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:20.819468 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1\": container with ID starting with 56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1 not found: ID does not exist" containerID="56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1"
Apr 24 19:25:20.819587 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.819504 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1"} err="failed to get container status \"56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1\": rpc error: code = NotFound desc = could not find container \"56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1\": container with ID starting with 56fc129c2948eb2c4e0624898b4bb25d4370e01ae21ff21b16dd54b357787ae1 not found: ID does not exist"
Apr 24 19:25:20.819587 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.819531 2573 scope.go:117] "RemoveContainer" containerID="70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7"
Apr 24 19:25:20.819786 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:20.819768 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7\": container with ID starting with 70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7 not found: ID does not exist" containerID="70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7"
Apr 24 19:25:20.819826 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.819792 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7"} err="failed to get container status \"70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7\": rpc error: code = NotFound desc = could not find container \"70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7\": container with ID starting with 70682350c8a14cf7403127990530bd52e8f2a01bc959fef4a6f02188724430f7 not found: ID does not exist"
Apr 24 19:25:20.833613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833592 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9v6\" (UniqueName: \"kubernetes.io/projected/60633c68-5404-4495-8e49-01e91e0f62ba-kube-api-access-2w9v6\") pod \"60633c68-5404-4495-8e49-01e91e0f62ba\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") "
Apr 24 19:25:20.833692 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833626 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/60633c68-5404-4495-8e49-01e91e0f62ba-error-404-isvc-99560-kube-rbac-proxy-sar-config\") pod \"60633c68-5404-4495-8e49-01e91e0f62ba\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") "
Apr 24 19:25:20.833728 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833691 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls\") pod \"60633c68-5404-4495-8e49-01e91e0f62ba\" (UID: \"60633c68-5404-4495-8e49-01e91e0f62ba\") "
Apr 24 19:25:20.833850 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833833 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9c97214-f5d6-411a-9961-05c62b416798-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:20.833907 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833856 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9c97214-f5d6-411a-9961-05c62b416798-success-200-isvc-99560-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:20.833907 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833872 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4tppl\" (UniqueName: \"kubernetes.io/projected/b9c97214-f5d6-411a-9961-05c62b416798-kube-api-access-4tppl\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:20.834057 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.833993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60633c68-5404-4495-8e49-01e91e0f62ba-error-404-isvc-99560-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-99560-kube-rbac-proxy-sar-config") pod "60633c68-5404-4495-8e49-01e91e0f62ba" (UID: "60633c68-5404-4495-8e49-01e91e0f62ba"). InnerVolumeSpecName "error-404-isvc-99560-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:25:20.835675 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.835655 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60633c68-5404-4495-8e49-01e91e0f62ba-kube-api-access-2w9v6" (OuterVolumeSpecName: "kube-api-access-2w9v6") pod "60633c68-5404-4495-8e49-01e91e0f62ba" (UID: "60633c68-5404-4495-8e49-01e91e0f62ba"). InnerVolumeSpecName "kube-api-access-2w9v6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:25:20.835787 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.835767 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "60633c68-5404-4495-8e49-01e91e0f62ba" (UID: "60633c68-5404-4495-8e49-01e91e0f62ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:25:20.934369 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.934331 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w9v6\" (UniqueName: \"kubernetes.io/projected/60633c68-5404-4495-8e49-01e91e0f62ba-kube-api-access-2w9v6\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:20.934369 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.934366 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-99560-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/60633c68-5404-4495-8e49-01e91e0f62ba-error-404-isvc-99560-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:20.934594 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:20.934382 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60633c68-5404-4495-8e49-01e91e0f62ba-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:21.106906 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:21.106875 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"]
Apr 24 19:25:21.110067 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:21.110045 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99560-predictor-59865b4b78-f7h57"]
Apr 24 19:25:21.380312 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:21.380240 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" path="/var/lib/kubelet/pods/60633c68-5404-4495-8e49-01e91e0f62ba/volumes"
Apr 24 19:25:21.380660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:21.380646 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c97214-f5d6-411a-9961-05c62b416798" path="/var/lib/kubelet/pods/b9c97214-f5d6-411a-9961-05c62b416798/volumes"
Apr 24 19:25:23.648207 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:23.648171 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"
Apr 24 19:25:24.652592 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:24.652566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"
Apr 24 19:25:24.786288 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:24.786262 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"
Apr 24 19:25:24.786716 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:24.786696 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"
Apr 24 19:25:24.786823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:24.786785 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 19:25:24.787067 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:24.787044 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 19:25:34.787363 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:34.787321 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 19:25:34.787758 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:34.787321 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 19:25:44.787612 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:44.787576 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 19:25:44.788097 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:44.787577 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 19:25:47.230345 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.230264 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"]
Apr 24 19:25:47.230799 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.230563 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container" containerID="cri-o://db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9" gracePeriod=30
Apr 24 19:25:47.230799 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.230626 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kube-rbac-proxy" containerID="cri-o://4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec" gracePeriod=30
Apr 24 19:25:47.292406 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292374 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"]
Apr 24 19:25:47.292899 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292880 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kube-rbac-proxy"
Apr 24 19:25:47.292976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292902 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kube-rbac-proxy"
Apr 24 19:25:47.292976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292930 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container"
Apr 24 19:25:47.292976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292939 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container"
Apr 24 19:25:47.292976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292950 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kube-rbac-proxy"
Apr 24 19:25:47.292976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292958 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kube-rbac-proxy"
Apr 24 19:25:47.292976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292976 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container"
Apr 24 19:25:47.293280 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.292984 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container"
Apr 24 19:25:47.293280 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.293056 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kserve-container"
Apr 24 19:25:47.293280 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.293070 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kserve-container"
Apr 24 19:25:47.293280 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.293081 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="60633c68-5404-4495-8e49-01e91e0f62ba" containerName="kube-rbac-proxy"
Apr 24 19:25:47.293280 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.293092 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9c97214-f5d6-411a-9961-05c62b416798" containerName="kube-rbac-proxy"
Apr 24 19:25:47.299734 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.299712 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.303124 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.303097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ec538-predictor-serving-cert\""
Apr 24 19:25:47.303124 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.303102 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ec538-kube-rbac-proxy-sar-config\""
Apr 24 19:25:47.306775 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.306752 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"]
Apr 24 19:25:47.333629 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.333603 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"]
Apr 24 19:25:47.333924 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.333895 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container" containerID="cri-o://fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2" gracePeriod=30
Apr 24 19:25:47.334050 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.333907 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kube-rbac-proxy" containerID="cri-o://4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b" gracePeriod=30
Apr 24 19:25:47.389398 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.389360 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"]
Apr 24 19:25:47.392849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.392829 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.395466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.395444 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ec538-predictor-serving-cert\""
Apr 24 19:25:47.395586 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.395466 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ec538-kube-rbac-proxy-sar-config\""
Apr 24 19:25:47.406123 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.406098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"]
Apr 24 19:25:47.447768 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.447745 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-success-200-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.447867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.447795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmmg\" (UniqueName: \"kubernetes.io/projected/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-kube-api-access-9dmmg\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.447867 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.447859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.447964 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.447945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-proxy-tls\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.448023 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.447980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xfn\" (UniqueName: \"kubernetes.io/projected/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-kube-api-access-27xfn\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.448080 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.448060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-error-404-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.549351 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.549310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.549564 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.549387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-proxy-tls\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.549564 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:47.549476 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-ec538-predictor-serving-cert: secret "success-200-isvc-ec538-predictor-serving-cert" not found
Apr 24 19:25:47.549564 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.549534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27xfn\" (UniqueName: \"kubernetes.io/projected/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-kube-api-access-27xfn\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.549564 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:47.549546 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls podName:dc2ae66b-d959-4e95-9f33-17fb9e2960a1 nodeName:}" failed. No retries permitted until 2026-04-24 19:25:48.049530369 +0000 UTC m=+1185.166870734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls") pod "success-200-isvc-ec538-predictor-767f7c7f5-ljrct" (UID: "dc2ae66b-d959-4e95-9f33-17fb9e2960a1") : secret "success-200-isvc-ec538-predictor-serving-cert" not found
Apr 24 19:25:47.549803 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.549606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-error-404-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.549803 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.549642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-success-200-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.549803 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.549681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dmmg\" (UniqueName: \"kubernetes.io/projected/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-kube-api-access-9dmmg\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.550369 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.550350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-success-200-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:47.550538 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.550514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-error-404-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.552095 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.552073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-proxy-tls\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.558388 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.558368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmmg\" (UniqueName: \"kubernetes.io/projected/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-kube-api-access-9dmmg\") pod \"error-404-isvc-ec538-predictor-5cc86fcf96-24dwx\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:47.558511 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.558374 2573 operation_generator.go:615] "MountVolume.SetUp succeeded
for volume \"kube-api-access-27xfn\" (UniqueName: \"kubernetes.io/projected/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-kube-api-access-27xfn\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:25:47.702531 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.702490 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" Apr 24 19:25:47.831713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.831643 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"] Apr 24 19:25:47.834835 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:25:47.834800 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f61d88_8d78_4f16_bc8b_ec61b385dcf6.slice/crio-ca5e27026da74ecf538b6bb0c191459f699d2359b0aa0b847eee51b13301fdef WatchSource:0}: Error finding container ca5e27026da74ecf538b6bb0c191459f699d2359b0aa0b847eee51b13301fdef: Status 404 returned error can't find the container with id ca5e27026da74ecf538b6bb0c191459f699d2359b0aa0b847eee51b13301fdef Apr 24 19:25:47.870966 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.870931 2573 generic.go:358] "Generic (PLEG): container finished" podID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerID="4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b" exitCode=2 Apr 24 19:25:47.871089 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.870964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" event={"ID":"0ddbb448-03b5-46b3-9277-eae73d6103ba","Type":"ContainerDied","Data":"4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b"} Apr 24 19:25:47.873116 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:25:47.873084 2573 generic.go:358] "Generic (PLEG): container finished" podID="6a243971-45ac-410a-9e5d-ef4761931a44" containerID="4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec" exitCode=2 Apr 24 19:25:47.873232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.873148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" event={"ID":"6a243971-45ac-410a-9e5d-ef4761931a44","Type":"ContainerDied","Data":"4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec"} Apr 24 19:25:47.874222 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:47.874177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" event={"ID":"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6","Type":"ContainerStarted","Data":"ca5e27026da74ecf538b6bb0c191459f699d2359b0aa0b847eee51b13301fdef"} Apr 24 19:25:48.053729 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.053694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:25:48.056182 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.056162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls\") pod \"success-200-isvc-ec538-predictor-767f7c7f5-ljrct\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:25:48.211824 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.211747 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:25:48.336060 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.335805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"] Apr 24 19:25:48.338357 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:25:48.338326 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc2ae66b_d959_4e95_9f33_17fb9e2960a1.slice/crio-b26830c739d40cda8856e407b8a13fb70b6fbfd0470c2ac39bc7bd5e48d8c680 WatchSource:0}: Error finding container b26830c739d40cda8856e407b8a13fb70b6fbfd0470c2ac39bc7bd5e48d8c680: Status 404 returned error can't find the container with id b26830c739d40cda8856e407b8a13fb70b6fbfd0470c2ac39bc7bd5e48d8c680 Apr 24 19:25:48.642680 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.642639 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 24 19:25:48.879602 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.879566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" event={"ID":"dc2ae66b-d959-4e95-9f33-17fb9e2960a1","Type":"ContainerStarted","Data":"204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50"} Apr 24 19:25:48.879815 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.879610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" 
event={"ID":"dc2ae66b-d959-4e95-9f33-17fb9e2960a1","Type":"ContainerStarted","Data":"d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50"} Apr 24 19:25:48.879815 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.879734 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:25:48.879932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.879854 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:25:48.879932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.879874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" event={"ID":"dc2ae66b-d959-4e95-9f33-17fb9e2960a1","Type":"ContainerStarted","Data":"b26830c739d40cda8856e407b8a13fb70b6fbfd0470c2ac39bc7bd5e48d8c680"} Apr 24 19:25:48.881249 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.881214 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 19:25:48.881393 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.881317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" event={"ID":"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6","Type":"ContainerStarted","Data":"44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf"} Apr 24 19:25:48.881393 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.881342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" 
event={"ID":"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6","Type":"ContainerStarted","Data":"d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834"} Apr 24 19:25:48.881513 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.881496 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" Apr 24 19:25:48.915722 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.915672 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podStartSLOduration=1.915659089 podStartE2EDuration="1.915659089s" podCreationTimestamp="2026-04-24 19:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:25:48.915345108 +0000 UTC m=+1186.032685507" watchObservedRunningTime="2026-04-24 19:25:48.915659089 +0000 UTC m=+1186.032999476" Apr 24 19:25:48.934873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:48.934827 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podStartSLOduration=1.934813152 podStartE2EDuration="1.934813152s" podCreationTimestamp="2026-04-24 19:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:25:48.933021161 +0000 UTC m=+1186.050361551" watchObservedRunningTime="2026-04-24 19:25:48.934813152 +0000 UTC m=+1186.052153540" Apr 24 19:25:49.647037 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:49.646995 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 
10.133.0.28:8643: connect: connection refused" Apr 24 19:25:49.884474 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:49.884420 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" Apr 24 19:25:49.884659 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:49.884582 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 19:25:49.885667 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:49.885642 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 19:25:50.795510 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.795482 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" Apr 24 19:25:50.798621 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.798603 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" Apr 24 19:25:50.879418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879328 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhkfb\" (UniqueName: \"kubernetes.io/projected/0ddbb448-03b5-46b3-9277-eae73d6103ba-kube-api-access-jhkfb\") pod \"0ddbb448-03b5-46b3-9277-eae73d6103ba\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " Apr 24 19:25:50.879418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879374 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ddbb448-03b5-46b3-9277-eae73d6103ba-proxy-tls\") pod \"0ddbb448-03b5-46b3-9277-eae73d6103ba\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " Apr 24 19:25:50.879418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879409 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ddbb448-03b5-46b3-9277-eae73d6103ba-error-404-isvc-b2e49-kube-rbac-proxy-sar-config\") pod \"0ddbb448-03b5-46b3-9277-eae73d6103ba\" (UID: \"0ddbb448-03b5-46b3-9277-eae73d6103ba\") " Apr 24 19:25:50.879721 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879490 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a243971-45ac-410a-9e5d-ef4761931a44-proxy-tls\") pod \"6a243971-45ac-410a-9e5d-ef4761931a44\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " Apr 24 19:25:50.879721 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879540 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a243971-45ac-410a-9e5d-ef4761931a44-success-200-isvc-b2e49-kube-rbac-proxy-sar-config\") pod 
\"6a243971-45ac-410a-9e5d-ef4761931a44\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " Apr 24 19:25:50.879721 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879569 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lq2x\" (UniqueName: \"kubernetes.io/projected/6a243971-45ac-410a-9e5d-ef4761931a44-kube-api-access-9lq2x\") pod \"6a243971-45ac-410a-9e5d-ef4761931a44\" (UID: \"6a243971-45ac-410a-9e5d-ef4761931a44\") " Apr 24 19:25:50.879876 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879851 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ddbb448-03b5-46b3-9277-eae73d6103ba-error-404-isvc-b2e49-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-b2e49-kube-rbac-proxy-sar-config") pod "0ddbb448-03b5-46b3-9277-eae73d6103ba" (UID: "0ddbb448-03b5-46b3-9277-eae73d6103ba"). InnerVolumeSpecName "error-404-isvc-b2e49-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:50.879985 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.879952 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a243971-45ac-410a-9e5d-ef4761931a44-success-200-isvc-b2e49-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-b2e49-kube-rbac-proxy-sar-config") pod "6a243971-45ac-410a-9e5d-ef4761931a44" (UID: "6a243971-45ac-410a-9e5d-ef4761931a44"). InnerVolumeSpecName "success-200-isvc-b2e49-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:50.880262 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.880168 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a243971-45ac-410a-9e5d-ef4761931a44-success-200-isvc-b2e49-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:25:50.880577 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.880553 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-b2e49-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ddbb448-03b5-46b3-9277-eae73d6103ba-error-404-isvc-b2e49-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:25:50.881912 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.881883 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddbb448-03b5-46b3-9277-eae73d6103ba-kube-api-access-jhkfb" (OuterVolumeSpecName: "kube-api-access-jhkfb") pod "0ddbb448-03b5-46b3-9277-eae73d6103ba" (UID: "0ddbb448-03b5-46b3-9277-eae73d6103ba"). InnerVolumeSpecName "kube-api-access-jhkfb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:25:50.881912 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.881885 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a243971-45ac-410a-9e5d-ef4761931a44-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6a243971-45ac-410a-9e5d-ef4761931a44" (UID: "6a243971-45ac-410a-9e5d-ef4761931a44"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:25:50.882030 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.881912 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a243971-45ac-410a-9e5d-ef4761931a44-kube-api-access-9lq2x" (OuterVolumeSpecName: "kube-api-access-9lq2x") pod "6a243971-45ac-410a-9e5d-ef4761931a44" (UID: "6a243971-45ac-410a-9e5d-ef4761931a44"). InnerVolumeSpecName "kube-api-access-9lq2x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:25:50.882078 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.882064 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddbb448-03b5-46b3-9277-eae73d6103ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0ddbb448-03b5-46b3-9277-eae73d6103ba" (UID: "0ddbb448-03b5-46b3-9277-eae73d6103ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:25:50.888962 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.888855 2573 generic.go:358] "Generic (PLEG): container finished" podID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerID="fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2" exitCode=0 Apr 24 19:25:50.888962 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.888889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" event={"ID":"0ddbb448-03b5-46b3-9277-eae73d6103ba","Type":"ContainerDied","Data":"fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2"} Apr 24 19:25:50.888962 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.888938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" event={"ID":"0ddbb448-03b5-46b3-9277-eae73d6103ba","Type":"ContainerDied","Data":"b71ba310eacbddfdf1c82866e2ba412a2319817c5de1d80689e6f07ebdff44b2"} Apr 24 19:25:50.888962 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:25:50.888954 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c" Apr 24 19:25:50.889353 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.888961 2573 scope.go:117] "RemoveContainer" containerID="4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b" Apr 24 19:25:50.891015 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.890991 2573 generic.go:358] "Generic (PLEG): container finished" podID="6a243971-45ac-410a-9e5d-ef4761931a44" containerID="db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9" exitCode=0 Apr 24 19:25:50.891114 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.891036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" event={"ID":"6a243971-45ac-410a-9e5d-ef4761931a44","Type":"ContainerDied","Data":"db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9"} Apr 24 19:25:50.891114 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.891068 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" Apr 24 19:25:50.891232 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.891070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs" event={"ID":"6a243971-45ac-410a-9e5d-ef4761931a44","Type":"ContainerDied","Data":"194c24ffdff86cf109256d49946a33379c1aa01eaa6d675197817b86bdfae4a1"} Apr 24 19:25:50.891659 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.891630 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 19:25:50.900747 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.900719 2573 scope.go:117] "RemoveContainer" containerID="fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2" Apr 24 19:25:50.908273 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.908249 2573 scope.go:117] "RemoveContainer" containerID="4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b" Apr 24 19:25:50.908550 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:50.908529 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b\": container with ID starting with 4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b not found: ID does not exist" containerID="4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b" Apr 24 19:25:50.908611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.908558 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b"} err="failed to get container status 
\"4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b\": rpc error: code = NotFound desc = could not find container \"4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b\": container with ID starting with 4e4ed2c6d40cefd8f3cbf22a901b3c3adf4d7494082ddc013ea923c2d6bb8c6b not found: ID does not exist" Apr 24 19:25:50.908611 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.908575 2573 scope.go:117] "RemoveContainer" containerID="fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2" Apr 24 19:25:50.908820 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:50.908804 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2\": container with ID starting with fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2 not found: ID does not exist" containerID="fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2" Apr 24 19:25:50.908866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.908824 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2"} err="failed to get container status \"fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2\": rpc error: code = NotFound desc = could not find container \"fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2\": container with ID starting with fe91d50394f6e03842292dc57e45d6d5be2b752584b2bf16958dc2b603b368e2 not found: ID does not exist" Apr 24 19:25:50.908866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.908842 2573 scope.go:117] "RemoveContainer" containerID="4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec" Apr 24 19:25:50.917229 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.917084 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"] Apr 24 19:25:50.917585 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.917473 2573 scope.go:117] "RemoveContainer" containerID="db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9" Apr 24 19:25:50.919638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.919619 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2e49-predictor-869fccf77f-86ffs"] Apr 24 19:25:50.926349 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.926334 2573 scope.go:117] "RemoveContainer" containerID="4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec" Apr 24 19:25:50.926608 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:50.926586 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec\": container with ID starting with 4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec not found: ID does not exist" containerID="4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec" Apr 24 19:25:50.926681 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.926616 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec"} err="failed to get container status \"4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec\": rpc error: code = NotFound desc = could not find container \"4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec\": container with ID starting with 4e0beabd7aecb511e07c19289ba9032373baa6e8f5920e723eba33a5711a29ec not found: ID does not exist" Apr 24 19:25:50.926681 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.926634 2573 scope.go:117] "RemoveContainer" containerID="db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9" Apr 24 19:25:50.926961 
ip-10-0-138-6 kubenswrapper[2573]: E0424 19:25:50.926866 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9\": container with ID starting with db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9 not found: ID does not exist" containerID="db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9"
Apr 24 19:25:50.926961 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.926894 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9"} err="failed to get container status \"db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9\": rpc error: code = NotFound desc = could not find container \"db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9\": container with ID starting with db085332cf3777b37ee39bb77e602acf9c35b7c097f7c2eb404bf8b769a148e9 not found: ID does not exist"
Apr 24 19:25:50.929020 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.929000 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"]
Apr 24 19:25:50.934219 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.934197 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2e49-predictor-5cc74f54d8-s2p4c"]
Apr 24 19:25:50.981990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.981952 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhkfb\" (UniqueName: \"kubernetes.io/projected/0ddbb448-03b5-46b3-9277-eae73d6103ba-kube-api-access-jhkfb\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:50.981990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.981990 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ddbb448-03b5-46b3-9277-eae73d6103ba-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:50.982200 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.982007 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a243971-45ac-410a-9e5d-ef4761931a44-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:50.982200 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:50.982020 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9lq2x\" (UniqueName: \"kubernetes.io/projected/6a243971-45ac-410a-9e5d-ef4761931a44-kube-api-access-9lq2x\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:25:51.379881 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:51.379849 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" path="/var/lib/kubelet/pods/0ddbb448-03b5-46b3-9277-eae73d6103ba/volumes"
Apr 24 19:25:51.380257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:51.380245 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" path="/var/lib/kubelet/pods/6a243971-45ac-410a-9e5d-ef4761931a44/volumes"
Apr 24 19:25:54.787064 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:54.787019 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 19:25:54.787545 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:54.787127 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 19:25:54.888473 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:54.888426 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"
Apr 24 19:25:54.888962 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:54.888935 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 19:25:55.895543 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:55.895517 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"
Apr 24 19:25:55.896051 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:25:55.896024 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 19:26:04.787408 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:04.787374 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"
Apr 24 19:26:04.788199 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:04.788176 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"
Apr 24 19:26:04.889120 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:04.889086 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 19:26:05.895994 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:05.895956 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 19:26:14.889672 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:14.889629 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 19:26:15.896018 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:15.895984 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 19:26:24.889717 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:24.889679 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 19:26:25.896765 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:25.896726 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 19:26:26.960453 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:26.957374 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"]
Apr 24 19:26:26.960453 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:26.957945 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" containerID="cri-o://d2f68f5a7c9c863ba4538b696952b522a8fe16251c31b2d35d717017c942205b" gracePeriod=30
Apr 24 19:26:26.960453 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:26.958112 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kube-rbac-proxy" containerID="cri-o://cbab6697a273aae82d007e6148d70b329e08290500cf7fcb073bdef7ce1775f8" gracePeriod=30
Apr 24 19:26:27.009418 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.009385 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"]
Apr 24 19:26:27.009756 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.009728 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" containerID="cri-o://a0b579adaedbe136a7a03658f6cd1df63aab75d7871610b3a7558ea1f607574d" gracePeriod=30
Apr 24 19:26:27.009851 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.009788 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kube-rbac-proxy" containerID="cri-o://8372952d92d88a457ec447675e6c0d75eb6f2d2563b6addfceed465d89c861c2" gracePeriod=30
Apr 24 19:26:27.032896 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.032872 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"]
Apr 24 19:26:27.033202 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033190 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033204 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033213 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033219 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033227 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kube-rbac-proxy"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033233 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kube-rbac-proxy"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033239 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kube-rbac-proxy"
Apr 24 19:26:27.033244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033244 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kube-rbac-proxy"
Apr 24 19:26:27.033482 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033301 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kserve-container"
Apr 24 19:26:27.033482 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033308 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kube-rbac-proxy"
Apr 24 19:26:27.033482 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033317 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ddbb448-03b5-46b3-9277-eae73d6103ba" containerName="kube-rbac-proxy"
Apr 24 19:26:27.033482 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.033324 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a243971-45ac-410a-9e5d-ef4761931a44" containerName="kserve-container"
Apr 24 19:26:27.036242 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.036227 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.038749 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.038726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-cb3b2-predictor-serving-cert\""
Apr 24 19:26:27.038860 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.038737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\""
Apr 24 19:26:27.046270 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.046248 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"]
Apr 24 19:26:27.066409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.066380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lz5\" (UniqueName: \"kubernetes.io/projected/6d5cedad-0251-4e89-96b5-e205fbec161d-kube-api-access-m5lz5\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.066558 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.066451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d5cedad-0251-4e89-96b5-e205fbec161d-success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.066626 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.066589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.126271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.126239 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"]
Apr 24 19:26:27.129476 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.129460 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.131968 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.131948 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\""
Apr 24 19:26:27.132081 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.131948 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-cb3b2-predictor-serving-cert\""
Apr 24 19:26:27.140221 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.140200 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"]
Apr 24 19:26:27.167638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.167604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d5cedad-0251-4e89-96b5-e205fbec161d-success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.167764 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.167642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcf9162-136e-4823-b6a2-285c935c7f14-proxy-tls\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.167764 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.167713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.167924 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:26:27.167800 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-serving-cert: secret "success-200-isvc-cb3b2-predictor-serving-cert" not found
Apr 24 19:26:27.167924 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.167800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s7b8\" (UniqueName: \"kubernetes.io/projected/2fcf9162-136e-4823-b6a2-285c935c7f14-kube-api-access-8s7b8\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.167924 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:26:27.167860 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls podName:6d5cedad-0251-4e89-96b5-e205fbec161d nodeName:}" failed. No retries permitted until 2026-04-24 19:26:27.667838647 +0000 UTC m=+1224.785179017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls") pod "success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" (UID: "6d5cedad-0251-4e89-96b5-e205fbec161d") : secret "success-200-isvc-cb3b2-predictor-serving-cert" not found
Apr 24 19:26:27.167924 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.167906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lz5\" (UniqueName: \"kubernetes.io/projected/6d5cedad-0251-4e89-96b5-e205fbec161d-kube-api-access-m5lz5\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.168122 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.167934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcf9162-136e-4823-b6a2-285c935c7f14-error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.168670 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.168652 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d5cedad-0251-4e89-96b5-e205fbec161d-success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.179818 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.179793 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lz5\" (UniqueName: \"kubernetes.io/projected/6d5cedad-0251-4e89-96b5-e205fbec161d-kube-api-access-m5lz5\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.268812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.268772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s7b8\" (UniqueName: \"kubernetes.io/projected/2fcf9162-136e-4823-b6a2-285c935c7f14-kube-api-access-8s7b8\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.269002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.268818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcf9162-136e-4823-b6a2-285c935c7f14-error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.269002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.268851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcf9162-136e-4823-b6a2-285c935c7f14-proxy-tls\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.269627 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.269602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcf9162-136e-4823-b6a2-285c935c7f14-error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.271548 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.271523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcf9162-136e-4823-b6a2-285c935c7f14-proxy-tls\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.276570 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.276548 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s7b8\" (UniqueName: \"kubernetes.io/projected/2fcf9162-136e-4823-b6a2-285c935c7f14-kube-api-access-8s7b8\") pod \"error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.440585 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.440542 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:27.558309 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.558268 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"]
Apr 24 19:26:27.560537 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:26:27.560507 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fcf9162_136e_4823_b6a2_285c935c7f14.slice/crio-e494371747a7c216491e448be261ff5a9fda397d47b249169680ddf7eb968b5b WatchSource:0}: Error finding container e494371747a7c216491e448be261ff5a9fda397d47b249169680ddf7eb968b5b: Status 404 returned error can't find the container with id e494371747a7c216491e448be261ff5a9fda397d47b249169680ddf7eb968b5b
Apr 24 19:26:27.672324 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.672290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.674669 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.674649 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls\") pod \"success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:27.946804 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:27.946718 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:28.006520 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.006490 2573 generic.go:358] "Generic (PLEG): container finished" podID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerID="cbab6697a273aae82d007e6148d70b329e08290500cf7fcb073bdef7ce1775f8" exitCode=2
Apr 24 19:26:28.006919 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.006563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" event={"ID":"cba2074e-1a62-4c76-bbe3-4f68b1110ff0","Type":"ContainerDied","Data":"cbab6697a273aae82d007e6148d70b329e08290500cf7fcb073bdef7ce1775f8"}
Apr 24 19:26:28.008446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.008303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" event={"ID":"2fcf9162-136e-4823-b6a2-285c935c7f14","Type":"ContainerStarted","Data":"5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e"}
Apr 24 19:26:28.008446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.008335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" event={"ID":"2fcf9162-136e-4823-b6a2-285c935c7f14","Type":"ContainerStarted","Data":"e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4"}
Apr 24 19:26:28.008446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.008352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" event={"ID":"2fcf9162-136e-4823-b6a2-285c935c7f14","Type":"ContainerStarted","Data":"e494371747a7c216491e448be261ff5a9fda397d47b249169680ddf7eb968b5b"}
Apr 24 19:26:28.008446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.008461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:28.010117 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.010094 2573 generic.go:358] "Generic (PLEG): container finished" podID="4cde8196-8e50-4150-a320-a610187ae842" containerID="8372952d92d88a457ec447675e6c0d75eb6f2d2563b6addfceed465d89c861c2" exitCode=2
Apr 24 19:26:28.010248 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.010144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" event={"ID":"4cde8196-8e50-4150-a320-a610187ae842","Type":"ContainerDied","Data":"8372952d92d88a457ec447675e6c0d75eb6f2d2563b6addfceed465d89c861c2"}
Apr 24 19:26:28.028509 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.028407 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podStartSLOduration=1.028390675 podStartE2EDuration="1.028390675s" podCreationTimestamp="2026-04-24 19:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:26:28.0253389 +0000 UTC m=+1225.142679290" watchObservedRunningTime="2026-04-24 19:26:28.028390675 +0000 UTC m=+1225.145731065"
Apr 24 19:26:28.068910 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:28.068868 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"]
Apr 24 19:26:28.073408 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:26:28.073359 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d5cedad_0251_4e89_96b5_e205fbec161d.slice/crio-59289d453d16b67e723ffbad957213e81b5f4f7c0ec556eae17b452f9b21e77f WatchSource:0}: Error finding container 59289d453d16b67e723ffbad957213e81b5f4f7c0ec556eae17b452f9b21e77f: Status 404 returned error can't find the container with id 59289d453d16b67e723ffbad957213e81b5f4f7c0ec556eae17b452f9b21e77f
Apr 24 19:26:29.015003 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.014970 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" event={"ID":"6d5cedad-0251-4e89-96b5-e205fbec161d","Type":"ContainerStarted","Data":"55ebcb48a91019e34e6918e115fb90195c5ec7af9cc745ee5c296ee0b4aa5864"}
Apr 24 19:26:29.015003 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.015005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" event={"ID":"6d5cedad-0251-4e89-96b5-e205fbec161d","Type":"ContainerStarted","Data":"bbfbc6a03d02ec73b1c15200889b86042dab8dda0b70065e6a96bff38afbb7a9"}
Apr 24 19:26:29.015468 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.015029 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:26:29.015468 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.015038 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" event={"ID":"6d5cedad-0251-4e89-96b5-e205fbec161d","Type":"ContainerStarted","Data":"59289d453d16b67e723ffbad957213e81b5f4f7c0ec556eae17b452f9b21e77f"}
Apr 24 19:26:29.015468 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.015079 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:29.016240 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.016218 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 24 19:26:29.033309 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.033272 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podStartSLOduration=2.033260519 podStartE2EDuration="2.033260519s" podCreationTimestamp="2026-04-24 19:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:26:29.032334037 +0000 UTC m=+1226.149674425" watchObservedRunningTime="2026-04-24 19:26:29.033260519 +0000 UTC m=+1226.150600901"
Apr 24 19:26:29.782762 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.782712 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused"
Apr 24 19:26:29.782932 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:29.782712 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused"
Apr 24 19:26:30.018600 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:30.018561 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 24 19:26:30.019028 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:30.018683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"
Apr 24 19:26:30.019968 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:30.019945 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 19:26:31.022963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.022867 2573 generic.go:358] "Generic (PLEG): container finished" podID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerID="d2f68f5a7c9c863ba4538b696952b522a8fe16251c31b2d35d717017c942205b" exitCode=0
Apr 24 19:26:31.022963 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.022943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" event={"ID":"cba2074e-1a62-4c76-bbe3-4f68b1110ff0","Type":"ContainerDied","Data":"d2f68f5a7c9c863ba4538b696952b522a8fe16251c31b2d35d717017c942205b"}
Apr 24 19:26:31.023355 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.022985 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" event={"ID":"cba2074e-1a62-4c76-bbe3-4f68b1110ff0","Type":"ContainerDied","Data":"85bcd4ed197b2ce74bdfcc1f5d2196925e12716d08a721f1f79504a1c5a8b1ee"}
Apr 24 19:26:31.023355 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.022999 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bcd4ed197b2ce74bdfcc1f5d2196925e12716d08a721f1f79504a1c5a8b1ee"
Apr 24 19:26:31.025055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.025028 2573 generic.go:358] "Generic (PLEG): container finished" podID="4cde8196-8e50-4150-a320-a610187ae842" containerID="a0b579adaedbe136a7a03658f6cd1df63aab75d7871610b3a7558ea1f607574d" exitCode=0
Apr 24 19:26:31.025178 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.025098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" event={"ID":"4cde8196-8e50-4150-a320-a610187ae842","Type":"ContainerDied","Data":"a0b579adaedbe136a7a03658f6cd1df63aab75d7871610b3a7558ea1f607574d"}
Apr 24 19:26:31.025536 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.025504 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 19:26:31.036324 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.036307 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"
Apr 24 19:26:31.039634 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.039615 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"
Apr 24 19:26:31.100424 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100393 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnbrh\" (UniqueName: \"kubernetes.io/projected/4cde8196-8e50-4150-a320-a610187ae842-kube-api-access-mnbrh\") pod \"4cde8196-8e50-4150-a320-a610187ae842\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") "
Apr 24 19:26:31.100593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100458 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-proxy-tls\") pod \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") "
Apr 24 19:26:31.100593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100498 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-error-404-isvc-443ff-kube-rbac-proxy-sar-config\") pod \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") "
Apr 24 19:26:31.100593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100514 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cde8196-8e50-4150-a320-a610187ae842-proxy-tls\") pod \"4cde8196-8e50-4150-a320-a610187ae842\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") "
Apr 24 19:26:31.100593 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100545 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cde8196-8e50-4150-a320-a610187ae842-success-200-isvc-443ff-kube-rbac-proxy-sar-config\") pod
\"4cde8196-8e50-4150-a320-a610187ae842\" (UID: \"4cde8196-8e50-4150-a320-a610187ae842\") " Apr 24 19:26:31.100869 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100603 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzsd2\" (UniqueName: \"kubernetes.io/projected/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-kube-api-access-kzsd2\") pod \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\" (UID: \"cba2074e-1a62-4c76-bbe3-4f68b1110ff0\") " Apr 24 19:26:31.100869 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100823 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-error-404-isvc-443ff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-443ff-kube-rbac-proxy-sar-config") pod "cba2074e-1a62-4c76-bbe3-4f68b1110ff0" (UID: "cba2074e-1a62-4c76-bbe3-4f68b1110ff0"). InnerVolumeSpecName "error-404-isvc-443ff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:26:31.100981 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100917 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-error-404-isvc-443ff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:26:31.100981 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.100958 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cde8196-8e50-4150-a320-a610187ae842-success-200-isvc-443ff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-443ff-kube-rbac-proxy-sar-config") pod "4cde8196-8e50-4150-a320-a610187ae842" (UID: "4cde8196-8e50-4150-a320-a610187ae842"). InnerVolumeSpecName "success-200-isvc-443ff-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:26:31.102726 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.102700 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cde8196-8e50-4150-a320-a610187ae842-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4cde8196-8e50-4150-a320-a610187ae842" (UID: "4cde8196-8e50-4150-a320-a610187ae842"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:26:31.102820 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.102778 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-kube-api-access-kzsd2" (OuterVolumeSpecName: "kube-api-access-kzsd2") pod "cba2074e-1a62-4c76-bbe3-4f68b1110ff0" (UID: "cba2074e-1a62-4c76-bbe3-4f68b1110ff0"). InnerVolumeSpecName "kube-api-access-kzsd2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:26:31.103213 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.103185 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cba2074e-1a62-4c76-bbe3-4f68b1110ff0" (UID: "cba2074e-1a62-4c76-bbe3-4f68b1110ff0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:26:31.103294 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.103188 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cde8196-8e50-4150-a320-a610187ae842-kube-api-access-mnbrh" (OuterVolumeSpecName: "kube-api-access-mnbrh") pod "4cde8196-8e50-4150-a320-a610187ae842" (UID: "4cde8196-8e50-4150-a320-a610187ae842"). InnerVolumeSpecName "kube-api-access-mnbrh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:26:31.201384 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.201345 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzsd2\" (UniqueName: \"kubernetes.io/projected/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-kube-api-access-kzsd2\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:26:31.201384 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.201380 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mnbrh\" (UniqueName: \"kubernetes.io/projected/4cde8196-8e50-4150-a320-a610187ae842-kube-api-access-mnbrh\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:26:31.201578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.201394 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cba2074e-1a62-4c76-bbe3-4f68b1110ff0-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:26:31.201578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.201403 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cde8196-8e50-4150-a320-a610187ae842-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:26:31.201578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:31.201412 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-443ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cde8196-8e50-4150-a320-a610187ae842-success-200-isvc-443ff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:26:32.029466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.029364 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" Apr 24 19:26:32.029466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.029388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm" event={"ID":"4cde8196-8e50-4150-a320-a610187ae842","Type":"ContainerDied","Data":"5cb2d45bd4333f2ab1200baff880ad04c05e4e9c4fa4c4621163b2d77db8617c"} Apr 24 19:26:32.029466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.029404 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4" Apr 24 19:26:32.029466 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.029448 2573 scope.go:117] "RemoveContainer" containerID="8372952d92d88a457ec447675e6c0d75eb6f2d2563b6addfceed465d89c861c2" Apr 24 19:26:32.037558 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.037541 2573 scope.go:117] "RemoveContainer" containerID="a0b579adaedbe136a7a03658f6cd1df63aab75d7871610b3a7558ea1f607574d" Apr 24 19:26:32.046962 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.046938 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"] Apr 24 19:26:32.050446 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.050414 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-443ff-predictor-6f4fd75694-x5jf4"] Apr 24 19:26:32.060262 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.060241 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"] Apr 24 19:26:32.067303 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:32.067278 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-443ff-predictor-f8c98b6df-98lqm"] Apr 24 19:26:33.380575 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:26:33.380535 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cde8196-8e50-4150-a320-a610187ae842" path="/var/lib/kubelet/pods/4cde8196-8e50-4150-a320-a610187ae842/volumes" Apr 24 19:26:33.381128 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:33.381107 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" path="/var/lib/kubelet/pods/cba2074e-1a62-4c76-bbe3-4f68b1110ff0/volumes" Apr 24 19:26:34.889633 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:34.889585 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 19:26:35.023161 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:35.023130 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" Apr 24 19:26:35.023713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:35.023688 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 19:26:35.896580 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:35.896553 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" Apr 24 19:26:36.029835 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:36.029810 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" Apr 24 19:26:36.030372 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:36.030345 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 19:26:44.889586 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:44.889554 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:26:45.023695 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:45.023650 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 19:26:46.030882 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:46.030847 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 19:26:55.023907 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:55.023819 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 19:26:56.030986 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:26:56.030939 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: 
connect: connection refused" Apr 24 19:27:05.023853 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:27:05.023812 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 19:27:06.030790 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:27:06.030752 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 19:27:15.025133 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:27:15.025101 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" Apr 24 19:27:16.031574 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:27:16.031548 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" Apr 24 19:32:03.450194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:32:03.450159 2573 scope.go:117] "RemoveContainer" containerID="d2f68f5a7c9c863ba4538b696952b522a8fe16251c31b2d35d717017c942205b" Apr 24 19:32:03.458104 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:32:03.458078 2573 scope.go:117] "RemoveContainer" containerID="cbab6697a273aae82d007e6148d70b329e08290500cf7fcb073bdef7ce1775f8" Apr 24 19:35:12.140300 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.140250 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"] Apr 24 19:35:12.142922 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.140552 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" containerID="cri-o://d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50" gracePeriod=30 Apr 24 19:35:12.142922 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.140629 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kube-rbac-proxy" containerID="cri-o://204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50" gracePeriod=30 Apr 24 19:35:12.197596 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.197566 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"] Apr 24 19:35:12.197835 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.197812 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" containerID="cri-o://d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834" gracePeriod=30 Apr 24 19:35:12.197930 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.197869 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kube-rbac-proxy" containerID="cri-o://44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf" gracePeriod=30 Apr 24 19:35:12.215005 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.214975 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"] Apr 24 19:35:12.215486 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215463 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kube-rbac-proxy" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215488 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kube-rbac-proxy" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215516 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215525 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215542 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kube-rbac-proxy" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215552 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kube-rbac-proxy" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215567 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" Apr 24 19:35:12.215613 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215576 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" Apr 24 19:35:12.215955 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215652 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kserve-container" Apr 24 19:35:12.215955 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215665 
2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cba2074e-1a62-4c76-bbe3-4f68b1110ff0" containerName="kube-rbac-proxy" Apr 24 19:35:12.215955 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215675 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kserve-container" Apr 24 19:35:12.215955 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.215686 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cde8196-8e50-4150-a320-a610187ae842" containerName="kube-rbac-proxy" Apr 24 19:35:12.219517 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.219422 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.222293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.222213 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8e956-kube-rbac-proxy-sar-config\"" Apr 24 19:35:12.222293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.222217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8e956-predictor-serving-cert\"" Apr 24 19:35:12.233874 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.233851 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"] Apr 24 19:35:12.265005 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.264978 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"] Apr 24 19:35:12.268274 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.268258 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.270590 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.270564 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8e956-predictor-serving-cert\"" Apr 24 19:35:12.270774 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.270567 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8e956-kube-rbac-proxy-sar-config\"" Apr 24 19:35:12.278769 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.278747 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"] Apr 24 19:35:12.398873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.398788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-error-404-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.398873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.398845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/597f7cde-771c-4421-a1c8-c72bafa6833b-proxy-tls\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.398873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.398871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.399074 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.398969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt66b\" (UniqueName: \"kubernetes.io/projected/597f7cde-771c-4421-a1c8-c72bafa6833b-kube-api-access-lt66b\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.399074 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.399008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/597f7cde-771c-4421-a1c8-c72bafa6833b-success-200-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.399074 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.399057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wjcg\" (UniqueName: \"kubernetes.io/projected/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-kube-api-access-6wjcg\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.499514 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.499478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt66b\" (UniqueName: 
\"kubernetes.io/projected/597f7cde-771c-4421-a1c8-c72bafa6833b-kube-api-access-lt66b\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.499700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.499534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/597f7cde-771c-4421-a1c8-c72bafa6833b-success-200-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.499700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.499590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wjcg\" (UniqueName: \"kubernetes.io/projected/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-kube-api-access-6wjcg\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.499700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.499620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-error-404-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.499700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.499652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/597f7cde-771c-4421-a1c8-c72bafa6833b-proxy-tls\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.499700 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.499677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.499963 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:12.499805 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-8e956-predictor-serving-cert: secret "error-404-isvc-8e956-predictor-serving-cert" not found Apr 24 19:35:12.499963 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:12.499864 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls podName:d84ecd56-c38d-41d2-9cc4-a0177cc734a1 nodeName:}" failed. No retries permitted until 2026-04-24 19:35:12.999843967 +0000 UTC m=+1750.117184339 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls") pod "error-404-isvc-8e956-predictor-579998b9d-qv6bt" (UID: "d84ecd56-c38d-41d2-9cc4-a0177cc734a1") : secret "error-404-isvc-8e956-predictor-serving-cert" not found Apr 24 19:35:12.500243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.500216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/597f7cde-771c-4421-a1c8-c72bafa6833b-success-200-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.500478 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.500460 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-error-404-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.500923 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.500893 2573 generic.go:358] "Generic (PLEG): container finished" podID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerID="44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf" exitCode=2 Apr 24 19:35:12.500985 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.500967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" event={"ID":"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6","Type":"ContainerDied","Data":"44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf"} Apr 24 19:35:12.502666 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:35:12.502643 2573 generic.go:358] "Generic (PLEG): container finished" podID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerID="204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50" exitCode=2 Apr 24 19:35:12.502802 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.502687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" event={"ID":"dc2ae66b-d959-4e95-9f33-17fb9e2960a1","Type":"ContainerDied","Data":"204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50"} Apr 24 19:35:12.502802 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.502747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/597f7cde-771c-4421-a1c8-c72bafa6833b-proxy-tls\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.510420 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.510400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt66b\" (UniqueName: \"kubernetes.io/projected/597f7cde-771c-4421-a1c8-c72bafa6833b-kube-api-access-lt66b\") pod \"success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.510816 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.510791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wjcg\" (UniqueName: \"kubernetes.io/projected/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-kube-api-access-6wjcg\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:12.530379 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:35:12.530355 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:12.649812 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.649765 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"] Apr 24 19:35:12.652807 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:35:12.652778 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod597f7cde_771c_4421_a1c8_c72bafa6833b.slice/crio-8ff025d2d2e76611a6fa7e1bda0e8358e521fd4031cb3fb36fa0630ec09bd0fb WatchSource:0}: Error finding container 8ff025d2d2e76611a6fa7e1bda0e8358e521fd4031cb3fb36fa0630ec09bd0fb: Status 404 returned error can't find the container with id 8ff025d2d2e76611a6fa7e1bda0e8358e521fd4031cb3fb36fa0630ec09bd0fb Apr 24 19:35:12.655058 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:12.655040 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:35:13.004801 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.004777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:13.007259 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.007227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls\") pod \"error-404-isvc-8e956-predictor-579998b9d-qv6bt\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " 
pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:13.179335 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.179299 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:13.297302 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.297280 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"] Apr 24 19:35:13.299481 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:35:13.299448 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd84ecd56_c38d_41d2_9cc4_a0177cc734a1.slice/crio-a567ac9f1356c0c98b8c6475ab523cfcdff347e256aead63f17d4c57483d27a1 WatchSource:0}: Error finding container a567ac9f1356c0c98b8c6475ab523cfcdff347e256aead63f17d4c57483d27a1: Status 404 returned error can't find the container with id a567ac9f1356c0c98b8c6475ab523cfcdff347e256aead63f17d4c57483d27a1 Apr 24 19:35:13.508865 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.508825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" event={"ID":"d84ecd56-c38d-41d2-9cc4-a0177cc734a1","Type":"ContainerStarted","Data":"9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471"} Apr 24 19:35:13.508865 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.508862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" event={"ID":"d84ecd56-c38d-41d2-9cc4-a0177cc734a1","Type":"ContainerStarted","Data":"7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92"} Apr 24 19:35:13.509111 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.508877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" 
event={"ID":"d84ecd56-c38d-41d2-9cc4-a0177cc734a1","Type":"ContainerStarted","Data":"a567ac9f1356c0c98b8c6475ab523cfcdff347e256aead63f17d4c57483d27a1"} Apr 24 19:35:13.509111 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.508900 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:13.509111 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.509007 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:13.510475 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.510450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" event={"ID":"597f7cde-771c-4421-a1c8-c72bafa6833b","Type":"ContainerStarted","Data":"fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd"} Apr 24 19:35:13.510475 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.510458 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 19:35:13.510624 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.510479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" event={"ID":"597f7cde-771c-4421-a1c8-c72bafa6833b","Type":"ContainerStarted","Data":"72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325"} Apr 24 19:35:13.510624 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.510488 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" 
event={"ID":"597f7cde-771c-4421-a1c8-c72bafa6833b","Type":"ContainerStarted","Data":"8ff025d2d2e76611a6fa7e1bda0e8358e521fd4031cb3fb36fa0630ec09bd0fb"} Apr 24 19:35:13.510688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.510645 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:13.510688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.510666 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:13.511585 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.511565 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 19:35:13.531951 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.531898 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podStartSLOduration=1.5318851 podStartE2EDuration="1.5318851s" podCreationTimestamp="2026-04-24 19:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:35:13.53036113 +0000 UTC m=+1750.647701519" watchObservedRunningTime="2026-04-24 19:35:13.5318851 +0000 UTC m=+1750.649225491" Apr 24 19:35:13.545979 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:13.545920 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podStartSLOduration=1.5459035989999999 podStartE2EDuration="1.545903599s" podCreationTimestamp="2026-04-24 19:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:35:13.54489804 +0000 UTC m=+1750.662238429" watchObservedRunningTime="2026-04-24 19:35:13.545903599 +0000 UTC m=+1750.663243989" Apr 24 19:35:14.514620 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:14.514576 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 19:35:14.514620 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:14.514585 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 19:35:14.885575 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:14.885487 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 24 19:35:14.889821 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:14.889800 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 19:35:15.510878 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.510857 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" Apr 24 19:35:15.513819 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.513801 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:35:15.517630 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.517609 2573 generic.go:358] "Generic (PLEG): container finished" podID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerID="d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834" exitCode=0 Apr 24 19:35:15.517936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.517672 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" Apr 24 19:35:15.517936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.517686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" event={"ID":"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6","Type":"ContainerDied","Data":"d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834"} Apr 24 19:35:15.517936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.517726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx" event={"ID":"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6","Type":"ContainerDied","Data":"ca5e27026da74ecf538b6bb0c191459f699d2359b0aa0b847eee51b13301fdef"} Apr 24 19:35:15.517936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.517747 2573 scope.go:117] "RemoveContainer" containerID="44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf" Apr 24 19:35:15.518998 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.518939 2573 generic.go:358] "Generic (PLEG): container finished" podID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" 
containerID="d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50" exitCode=0 Apr 24 19:35:15.519075 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.519007 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" Apr 24 19:35:15.519075 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.519022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" event={"ID":"dc2ae66b-d959-4e95-9f33-17fb9e2960a1","Type":"ContainerDied","Data":"d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50"} Apr 24 19:35:15.519075 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.519050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct" event={"ID":"dc2ae66b-d959-4e95-9f33-17fb9e2960a1","Type":"ContainerDied","Data":"b26830c739d40cda8856e407b8a13fb70b6fbfd0470c2ac39bc7bd5e48d8c680"} Apr 24 19:35:15.523245 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523224 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dmmg\" (UniqueName: \"kubernetes.io/projected/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-kube-api-access-9dmmg\") pod \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " Apr 24 19:35:15.523342 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523271 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-error-404-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " Apr 24 19:35:15.523342 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523328 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-proxy-tls\") pod \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\" (UID: \"a6f61d88-8d78-4f16-bc8b-ec61b385dcf6\") " Apr 24 19:35:15.523475 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523353 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls\") pod \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " Apr 24 19:35:15.523475 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523397 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-success-200-isvc-ec538-kube-rbac-proxy-sar-config\") pod \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " Apr 24 19:35:15.523578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523481 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27xfn\" (UniqueName: \"kubernetes.io/projected/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-kube-api-access-27xfn\") pod \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\" (UID: \"dc2ae66b-d959-4e95-9f33-17fb9e2960a1\") " Apr 24 19:35:15.523694 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523665 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-error-404-isvc-ec538-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ec538-kube-rbac-proxy-sar-config") pod "a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" (UID: "a6f61d88-8d78-4f16-bc8b-ec61b385dcf6"). InnerVolumeSpecName "error-404-isvc-ec538-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:35:15.523849 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523825 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-error-404-isvc-ec538-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:15.523942 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.523864 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-success-200-isvc-ec538-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ec538-kube-rbac-proxy-sar-config") pod "dc2ae66b-d959-4e95-9f33-17fb9e2960a1" (UID: "dc2ae66b-d959-4e95-9f33-17fb9e2960a1"). InnerVolumeSpecName "success-200-isvc-ec538-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:35:15.525535 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.525501 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-kube-api-access-9dmmg" (OuterVolumeSpecName: "kube-api-access-9dmmg") pod "a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" (UID: "a6f61d88-8d78-4f16-bc8b-ec61b385dcf6"). InnerVolumeSpecName "kube-api-access-9dmmg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:35:15.525661 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.525629 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" (UID: "a6f61d88-8d78-4f16-bc8b-ec61b385dcf6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:35:15.525814 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.525795 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dc2ae66b-d959-4e95-9f33-17fb9e2960a1" (UID: "dc2ae66b-d959-4e95-9f33-17fb9e2960a1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:35:15.525948 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.525930 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-kube-api-access-27xfn" (OuterVolumeSpecName: "kube-api-access-27xfn") pod "dc2ae66b-d959-4e95-9f33-17fb9e2960a1" (UID: "dc2ae66b-d959-4e95-9f33-17fb9e2960a1"). InnerVolumeSpecName "kube-api-access-27xfn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:35:15.526403 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.526389 2573 scope.go:117] "RemoveContainer" containerID="d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834" Apr 24 19:35:15.534995 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.534980 2573 scope.go:117] "RemoveContainer" containerID="44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf" Apr 24 19:35:15.535270 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:15.535250 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf\": container with ID starting with 44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf not found: ID does not exist" containerID="44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf" Apr 24 19:35:15.535357 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.535285 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf"} err="failed to get container status \"44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf\": rpc error: code = NotFound desc = could not find container \"44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf\": container with ID starting with 44bed989e1ed4cf40f355246ee81f588110591ce8abb00e16e9e5c008c12febf not found: ID does not exist" Apr 24 19:35:15.535357 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.535308 2573 scope.go:117] "RemoveContainer" containerID="d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834" Apr 24 19:35:15.535628 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:15.535611 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834\": container with ID starting with d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834 not found: ID does not exist" containerID="d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834" Apr 24 19:35:15.535696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.535634 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834"} err="failed to get container status \"d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834\": rpc error: code = NotFound desc = could not find container \"d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834\": container with ID starting with d918d035f2fd0525481c6803e28bf458dd15fcc9b2ad2acd0a06e3d436b95834 not found: ID does not exist" Apr 24 19:35:15.535696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.535649 2573 scope.go:117] "RemoveContainer" containerID="204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50" Apr 24 19:35:15.542607 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:35:15.542577 2573 scope.go:117] "RemoveContainer" containerID="d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50" Apr 24 19:35:15.549757 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.549599 2573 scope.go:117] "RemoveContainer" containerID="204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50" Apr 24 19:35:15.549877 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:15.549854 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50\": container with ID starting with 204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50 not found: ID does not exist" containerID="204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50" Apr 24 19:35:15.549989 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.549884 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50"} err="failed to get container status \"204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50\": rpc error: code = NotFound desc = could not find container \"204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50\": container with ID starting with 204b2f54abfc24fabf11ecd3c2d8410a40d3ae812ca0b757901b6d2b18596e50 not found: ID does not exist" Apr 24 19:35:15.549989 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.549908 2573 scope.go:117] "RemoveContainer" containerID="d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50" Apr 24 19:35:15.550172 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:15.550147 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50\": container with ID starting with 
d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50 not found: ID does not exist" containerID="d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50" Apr 24 19:35:15.550244 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.550178 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50"} err="failed to get container status \"d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50\": rpc error: code = NotFound desc = could not find container \"d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50\": container with ID starting with d38f735d7591d61cfdacfed0ea62742d2c5a12970b1e613d8168aeac51ac5f50 not found: ID does not exist" Apr 24 19:35:15.624303 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.624272 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:15.624303 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.624298 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:15.624303 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.624308 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ec538-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-success-200-isvc-ec538-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:15.624542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.624321 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27xfn\" (UniqueName: 
\"kubernetes.io/projected/dc2ae66b-d959-4e95-9f33-17fb9e2960a1-kube-api-access-27xfn\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:15.624542 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.624331 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9dmmg\" (UniqueName: \"kubernetes.io/projected/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6-kube-api-access-9dmmg\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:15.841240 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.841215 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"] Apr 24 19:35:15.845080 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.845056 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ec538-predictor-5cc86fcf96-24dwx"] Apr 24 19:35:15.855326 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.855300 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"] Apr 24 19:35:15.860409 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:15.860389 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ec538-predictor-767f7c7f5-ljrct"] Apr 24 19:35:17.380514 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:17.380480 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" path="/var/lib/kubelet/pods/a6f61d88-8d78-4f16-bc8b-ec61b385dcf6/volumes" Apr 24 19:35:17.380977 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:17.380958 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" path="/var/lib/kubelet/pods/dc2ae66b-d959-4e95-9f33-17fb9e2960a1/volumes" Apr 24 19:35:19.521002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:19.520675 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:35:19.521468 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:19.521145 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 19:35:19.521468 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:19.521411 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:35:19.521819 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:19.521782 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 19:35:29.521669 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:29.521628 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 19:35:29.522090 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:29.521880 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 19:35:39.521194 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:39.521154 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 19:35:39.522377 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:39.522352 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 19:35:41.856618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.856587 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"] Apr 24 19:35:41.857002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.856855 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container" containerID="cri-o://bbfbc6a03d02ec73b1c15200889b86042dab8dda0b70065e6a96bff38afbb7a9" gracePeriod=30 Apr 24 19:35:41.857002 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.856927 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kube-rbac-proxy" containerID="cri-o://55ebcb48a91019e34e6918e115fb90195c5ec7af9cc745ee5c296ee0b4aa5864" gracePeriod=30 Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894366 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj"] Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894759 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kube-rbac-proxy" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894774 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kube-rbac-proxy" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894795 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kube-rbac-proxy" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894801 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kube-rbac-proxy" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894808 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894813 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894821 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" Apr 24 19:35:41.894823 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894826 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" Apr 24 19:35:41.895385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894879 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kserve-container" Apr 24 19:35:41.895385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894891 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kube-rbac-proxy" Apr 24 19:35:41.895385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894897 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc2ae66b-d959-4e95-9f33-17fb9e2960a1" containerName="kserve-container" Apr 24 19:35:41.895385 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.894903 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f61d88-8d78-4f16-bc8b-ec61b385dcf6" containerName="kube-rbac-proxy" Apr 24 19:35:41.898266 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.898241 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:41.900801 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.900660 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a0351-predictor-serving-cert\"" Apr 24 19:35:41.900801 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.900676 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a0351-kube-rbac-proxy-sar-config\"" Apr 24 19:35:41.907625 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.907600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj"] Apr 24 19:35:41.928895 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.928868 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"] Apr 24 19:35:41.929185 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.929152 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" 
containerID="cri-o://e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4" gracePeriod=30 Apr 24 19:35:41.929185 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.929176 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kube-rbac-proxy" containerID="cri-o://5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e" gracePeriod=30 Apr 24 19:35:41.983423 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.983390 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"] Apr 24 19:35:41.986797 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.986774 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:41.989296 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.989269 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a0351-predictor-serving-cert\"" Apr 24 19:35:41.989427 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.989302 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a0351-kube-rbac-proxy-sar-config\"" Apr 24 19:35:41.996209 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:41.996180 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"] Apr 24 19:35:42.018708 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.018666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptp54\" (UniqueName: \"kubernetes.io/projected/39c2c5c2-4106-44f1-9132-cc5e3065e04a-kube-api-access-ptp54\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: 
\"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.018825 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.018728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39c2c5c2-4106-44f1-9132-cc5e3065e04a-success-200-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.018825 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.018769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44w65\" (UniqueName: \"kubernetes.io/projected/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-kube-api-access-44w65\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.018913 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.018831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-error-404-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.018913 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.018886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c2c5c2-4106-44f1-9132-cc5e3065e04a-proxy-tls\") pod 
\"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.018913 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.018902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.120155 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44w65\" (UniqueName: \"kubernetes.io/projected/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-kube-api-access-44w65\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.120155 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-error-404-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.120355 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c2c5c2-4106-44f1-9132-cc5e3065e04a-proxy-tls\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " 
pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.120355 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.120512 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptp54\" (UniqueName: \"kubernetes.io/projected/39c2c5c2-4106-44f1-9132-cc5e3065e04a-kube-api-access-ptp54\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.120512 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39c2c5c2-4106-44f1-9132-cc5e3065e04a-success-200-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.120512 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:42.120479 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-a0351-predictor-serving-cert: secret "error-404-isvc-a0351-predictor-serving-cert" not found Apr 24 19:35:42.120683 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:42.120566 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls 
podName:76541b04-b9fb-4b4d-b9c6-e9487f3a1b80 nodeName:}" failed. No retries permitted until 2026-04-24 19:35:42.620543015 +0000 UTC m=+1779.737883381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls") pod "error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" (UID: "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80") : secret "error-404-isvc-a0351-predictor-serving-cert" not found Apr 24 19:35:42.120886 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.120866 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-error-404-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.121110 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.121090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39c2c5c2-4106-44f1-9132-cc5e3065e04a-success-200-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.122654 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.122635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c2c5c2-4106-44f1-9132-cc5e3065e04a-proxy-tls\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.131238 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:35:42.131214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptp54\" (UniqueName: \"kubernetes.io/projected/39c2c5c2-4106-44f1-9132-cc5e3065e04a-kube-api-access-ptp54\") pod \"success-200-isvc-a0351-predictor-575fcb594c-rxptj\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.131238 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.131230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44w65\" (UniqueName: \"kubernetes.io/projected/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-kube-api-access-44w65\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.212930 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.212898 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.343730 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.343700 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj"] Apr 24 19:35:42.347067 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:35:42.347036 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c2c5c2_4106_44f1_9132_cc5e3065e04a.slice/crio-182a28698a7c976b89e7bff0e20c88636aeeee964a02a2aade81c47b6f04558e WatchSource:0}: Error finding container 182a28698a7c976b89e7bff0e20c88636aeeee964a02a2aade81c47b6f04558e: Status 404 returned error can't find the container with id 182a28698a7c976b89e7bff0e20c88636aeeee964a02a2aade81c47b6f04558e Apr 24 19:35:42.602664 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.602625 2573 generic.go:358] "Generic (PLEG): container finished" podID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerID="5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e" exitCode=2 Apr 24 19:35:42.602852 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.602689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" event={"ID":"2fcf9162-136e-4823-b6a2-285c935c7f14","Type":"ContainerDied","Data":"5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e"} Apr 24 19:35:42.604209 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.604181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" event={"ID":"39c2c5c2-4106-44f1-9132-cc5e3065e04a","Type":"ContainerStarted","Data":"a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c"} Apr 24 19:35:42.604209 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.604213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" event={"ID":"39c2c5c2-4106-44f1-9132-cc5e3065e04a","Type":"ContainerStarted","Data":"1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00"} Apr 24 19:35:42.604416 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.604228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.604416 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.604242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" event={"ID":"39c2c5c2-4106-44f1-9132-cc5e3065e04a","Type":"ContainerStarted","Data":"182a28698a7c976b89e7bff0e20c88636aeeee964a02a2aade81c47b6f04558e"} Apr 24 19:35:42.604416 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.604256 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:42.605747 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.605724 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 19:35:42.605848 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.605794 2573 generic.go:358] "Generic (PLEG): container finished" podID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerID="55ebcb48a91019e34e6918e115fb90195c5ec7af9cc745ee5c296ee0b4aa5864" exitCode=2 Apr 24 19:35:42.605848 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.605824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" 
event={"ID":"6d5cedad-0251-4e89-96b5-e205fbec161d","Type":"ContainerDied","Data":"55ebcb48a91019e34e6918e115fb90195c5ec7af9cc745ee5c296ee0b4aa5864"} Apr 24 19:35:42.622768 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.622730 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podStartSLOduration=1.6227166450000001 podStartE2EDuration="1.622716645s" podCreationTimestamp="2026-04-24 19:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:35:42.620805295 +0000 UTC m=+1779.738145684" watchObservedRunningTime="2026-04-24 19:35:42.622716645 +0000 UTC m=+1779.740057030" Apr 24 19:35:42.625373 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.625342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.627872 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.627849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls\") pod \"error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:42.898015 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:42.897981 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:43.020378 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.020354 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"] Apr 24 19:35:43.022370 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:35:43.022339 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76541b04_b9fb_4b4d_b9c6_e9487f3a1b80.slice/crio-3e0de82566acdd2e9c3fbfd726ce203a78f1c57af3856009c727c236bf4396fe WatchSource:0}: Error finding container 3e0de82566acdd2e9c3fbfd726ce203a78f1c57af3856009c727c236bf4396fe: Status 404 returned error can't find the container with id 3e0de82566acdd2e9c3fbfd726ce203a78f1c57af3856009c727c236bf4396fe Apr 24 19:35:43.611140 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.611098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" event={"ID":"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80","Type":"ContainerStarted","Data":"357efcd04511f0ef09f6c6d898862eb9aa6be58c8ad5efb670de7370dd287401"} Apr 24 19:35:43.611140 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.611142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" event={"ID":"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80","Type":"ContainerStarted","Data":"da9331c428514f84e336cd605a2abb673834364cb157a3cde45107a6f3049fb0"} Apr 24 19:35:43.611399 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.611160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" event={"ID":"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80","Type":"ContainerStarted","Data":"3e0de82566acdd2e9c3fbfd726ce203a78f1c57af3856009c727c236bf4396fe"} Apr 24 19:35:43.611492 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:35:43.611422 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:43.611492 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.611480 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:35:43.611645 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.611623 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 19:35:43.612497 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.612478 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 24 19:35:43.629321 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:43.629282 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podStartSLOduration=2.629271238 podStartE2EDuration="2.629271238s" podCreationTimestamp="2026-04-24 19:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:35:43.627706227 +0000 UTC m=+1780.745046616" watchObservedRunningTime="2026-04-24 19:35:43.629271238 +0000 UTC m=+1780.746611698" Apr 24 19:35:44.615390 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:44.615354 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 24 19:35:45.019618 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.019571 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 24 19:35:45.023907 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.023882 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 19:35:45.620506 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.620478 2573 generic.go:358] "Generic (PLEG): container finished" podID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerID="bbfbc6a03d02ec73b1c15200889b86042dab8dda0b70065e6a96bff38afbb7a9" exitCode=0 Apr 24 19:35:45.620808 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.620547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" event={"ID":"6d5cedad-0251-4e89-96b5-e205fbec161d","Type":"ContainerDied","Data":"bbfbc6a03d02ec73b1c15200889b86042dab8dda0b70065e6a96bff38afbb7a9"} Apr 24 19:35:45.706465 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.706417 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" Apr 24 19:35:45.752825 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.752804 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lz5\" (UniqueName: \"kubernetes.io/projected/6d5cedad-0251-4e89-96b5-e205fbec161d-kube-api-access-m5lz5\") pod \"6d5cedad-0251-4e89-96b5-e205fbec161d\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " Apr 24 19:35:45.752960 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.752849 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls\") pod \"6d5cedad-0251-4e89-96b5-e205fbec161d\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " Apr 24 19:35:45.752960 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.752899 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d5cedad-0251-4e89-96b5-e205fbec161d-success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"6d5cedad-0251-4e89-96b5-e205fbec161d\" (UID: \"6d5cedad-0251-4e89-96b5-e205fbec161d\") " Apr 24 19:35:45.753309 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.753282 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d5cedad-0251-4e89-96b5-e205fbec161d-success-200-isvc-cb3b2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-cb3b2-kube-rbac-proxy-sar-config") pod "6d5cedad-0251-4e89-96b5-e205fbec161d" (UID: "6d5cedad-0251-4e89-96b5-e205fbec161d"). InnerVolumeSpecName "success-200-isvc-cb3b2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:35:45.755086 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.755058 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6d5cedad-0251-4e89-96b5-e205fbec161d" (UID: "6d5cedad-0251-4e89-96b5-e205fbec161d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:35:45.755086 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.755065 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5cedad-0251-4e89-96b5-e205fbec161d-kube-api-access-m5lz5" (OuterVolumeSpecName: "kube-api-access-m5lz5") pod "6d5cedad-0251-4e89-96b5-e205fbec161d" (UID: "6d5cedad-0251-4e89-96b5-e205fbec161d"). InnerVolumeSpecName "kube-api-access-m5lz5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:35:45.853854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.853824 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6d5cedad-0251-4e89-96b5-e205fbec161d-success-200-isvc-cb3b2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:45.853854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.853858 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5lz5\" (UniqueName: \"kubernetes.io/projected/6d5cedad-0251-4e89-96b5-e205fbec161d-kube-api-access-m5lz5\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:45.854039 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:45.853872 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d5cedad-0251-4e89-96b5-e205fbec161d-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:35:46.625230 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:46.625191 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" event={"ID":"6d5cedad-0251-4e89-96b5-e205fbec161d","Type":"ContainerDied","Data":"59289d453d16b67e723ffbad957213e81b5f4f7c0ec556eae17b452f9b21e77f"} Apr 24 19:35:46.625693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:46.625238 2573 scope.go:117] "RemoveContainer" containerID="55ebcb48a91019e34e6918e115fb90195c5ec7af9cc745ee5c296ee0b4aa5864" Apr 24 19:35:46.625693 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:46.625285 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd" Apr 24 19:35:46.633975 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:46.633957 2573 scope.go:117] "RemoveContainer" containerID="bbfbc6a03d02ec73b1c15200889b86042dab8dda0b70065e6a96bff38afbb7a9" Apr 24 19:35:46.645750 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:46.645729 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"] Apr 24 19:35:46.651591 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:46.651571 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cb3b2-predictor-7cb6bb55c9-qkmvd"] Apr 24 19:35:47.379581 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:47.379552 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" path="/var/lib/kubelet/pods/6d5cedad-0251-4e89-96b5-e205fbec161d/volumes" Apr 24 19:35:48.615773 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:48.615690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:35:48.616156 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:48.616132 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 19:35:48.976027 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:48.976002 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" Apr 24 19:35:49.083325 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.083290 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s7b8\" (UniqueName: \"kubernetes.io/projected/2fcf9162-136e-4823-b6a2-285c935c7f14-kube-api-access-8s7b8\") pod \"2fcf9162-136e-4823-b6a2-285c935c7f14\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " Apr 24 19:35:49.083528 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.083342 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcf9162-136e-4823-b6a2-285c935c7f14-proxy-tls\") pod \"2fcf9162-136e-4823-b6a2-285c935c7f14\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " Apr 24 19:35:49.083528 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.083475 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcf9162-136e-4823-b6a2-285c935c7f14-error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\") pod \"2fcf9162-136e-4823-b6a2-285c935c7f14\" (UID: \"2fcf9162-136e-4823-b6a2-285c935c7f14\") " Apr 24 19:35:49.083829 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.083803 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fcf9162-136e-4823-b6a2-285c935c7f14-error-404-isvc-cb3b2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config") pod "2fcf9162-136e-4823-b6a2-285c935c7f14" (UID: "2fcf9162-136e-4823-b6a2-285c935c7f14"). InnerVolumeSpecName "error-404-isvc-cb3b2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:35:49.085538 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.085502 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fcf9162-136e-4823-b6a2-285c935c7f14-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2fcf9162-136e-4823-b6a2-285c935c7f14" (UID: "2fcf9162-136e-4823-b6a2-285c935c7f14"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:35:49.085638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.085621 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fcf9162-136e-4823-b6a2-285c935c7f14-kube-api-access-8s7b8" (OuterVolumeSpecName: "kube-api-access-8s7b8") pod "2fcf9162-136e-4823-b6a2-285c935c7f14" (UID: "2fcf9162-136e-4823-b6a2-285c935c7f14"). InnerVolumeSpecName "kube-api-access-8s7b8".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:35:49.184688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.184605 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8s7b8\" (UniqueName: \"kubernetes.io/projected/2fcf9162-136e-4823-b6a2-285c935c7f14-kube-api-access-8s7b8\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:35:49.184688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.184636 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcf9162-136e-4823-b6a2-285c935c7f14-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:35:49.184688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.184648 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcf9162-136e-4823-b6a2-285c935c7f14-error-404-isvc-cb3b2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\""
Apr 24 19:35:49.521511 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.521471 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 24 19:35:49.521921 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.521891 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 24 19:35:49.619469 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.619421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"
Apr 24 19:35:49.619943 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.619918 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 24 19:35:49.637894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.637636 2573 generic.go:358] "Generic (PLEG): container finished" podID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerID="e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4" exitCode=0
Apr 24 19:35:49.637894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.637715 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" event={"ID":"2fcf9162-136e-4823-b6a2-285c935c7f14","Type":"ContainerDied","Data":"e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4"}
Apr 24 19:35:49.637894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.637746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7" event={"ID":"2fcf9162-136e-4823-b6a2-285c935c7f14","Type":"ContainerDied","Data":"e494371747a7c216491e448be261ff5a9fda397d47b249169680ddf7eb968b5b"}
Apr 24 19:35:49.637894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.637764 2573 scope.go:117] "RemoveContainer" containerID="5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e"
Apr 24 19:35:49.637894 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.637770 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"
Apr 24 19:35:49.646034 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.646013 2573 scope.go:117] "RemoveContainer" containerID="e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4"
Apr 24 19:35:49.652911 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.652883 2573 scope.go:117] "RemoveContainer" containerID="5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e"
Apr 24 19:35:49.653153 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:49.653132 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e\": container with ID starting with 5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e not found: ID does not exist" containerID="5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e"
Apr 24 19:35:49.653226 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.653165 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e"} err="failed to get container status \"5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e\": rpc error: code = NotFound desc = could not find container \"5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e\": container with ID starting with 5ff9812c45d24feca51ab0b43b0f0e496cb8d83958be478905cb3ea80b83eb9e not found: ID does not exist"
Apr 24 19:35:49.653226 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.653190 2573 scope.go:117] "RemoveContainer" containerID="e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4"
Apr 24 19:35:49.653491 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:35:49.653472 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4\": container with ID starting with e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4 not found: ID does not exist" containerID="e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4"
Apr 24 19:35:49.653563 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.653497 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4"} err="failed to get container status \"e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4\": rpc error: code = NotFound desc = could not find container \"e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4\": container with ID starting with e80df61a0759f1f1e206bd1e2a74c4afb87268d73203d6d50d53947c8fa59ee4 not found: ID does not exist"
Apr 24 19:35:49.655191 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.655168 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"]
Apr 24 19:35:49.656477 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:49.656456 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cb3b2-predictor-754d94dcdd-qgrb7"]
Apr 24 19:35:51.379723 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:51.379692 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" path="/var/lib/kubelet/pods/2fcf9162-136e-4823-b6a2-285c935c7f14/volumes"
Apr 24 19:35:58.616322 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:58.616288 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 24 19:35:59.522095 ip-10-0-138-6
kubenswrapper[2573]: I0424 19:35:59.522058 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"
Apr 24 19:35:59.522571 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:59.522550 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"
Apr 24 19:35:59.620325 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:35:59.620293 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 24 19:36:08.616292 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:08.616246 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 24 19:36:09.619959 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:09.619921 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 24 19:36:18.616617 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:18.616570 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 24 19:36:19.620351 ip-10-0-138-6 kubenswrapper[2573]: I0424
19:36:19.620312 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 24 19:36:22.417295 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.417212 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"]
Apr 24 19:36:22.417716 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.417518 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container" containerID="cri-o://72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325" gracePeriod=30
Apr 24 19:36:22.417716 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.417567 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kube-rbac-proxy" containerID="cri-o://fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd" gracePeriod=30
Apr 24 19:36:22.457346 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457315 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"]
Apr 24 19:36:22.457660 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457646 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container"
Apr 24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457662 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container"
Apr
24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457671 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container"
Apr 24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457677 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container"
Apr 24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457685 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kube-rbac-proxy"
Apr 24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457690 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kube-rbac-proxy"
Apr 24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457702 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kube-rbac-proxy"
Apr 24 19:36:22.457713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457707 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kube-rbac-proxy"
Apr 24 19:36:22.457918 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457756 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kserve-container"
Apr 24 19:36:22.457918 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457765 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kube-rbac-proxy"
Apr 24 19:36:22.457918 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457772 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fcf9162-136e-4823-b6a2-285c935c7f14" containerName="kube-rbac-proxy"
Apr
24 19:36:22.457918 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.457779 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d5cedad-0251-4e89-96b5-e205fbec161d" containerName="kserve-container"
Apr 24 19:36:22.466976 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.466948 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.470522 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.470483 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"]
Apr 24 19:36:22.472062 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.471290 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-11f3c-predictor-serving-cert\""
Apr 24 19:36:22.472062 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.471558 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-11f3c-kube-rbac-proxy-sar-config\""
Apr 24 19:36:22.483908 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.483885 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"]
Apr 24 19:36:22.484184 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.484146 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container" containerID="cri-o://7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92" gracePeriod=30
Apr 24 19:36:22.484262 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.484221 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"
podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kube-rbac-proxy" containerID="cri-o://9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471" gracePeriod=30
Apr 24 19:36:22.541713 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.541683 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"]
Apr 24 19:36:22.544956 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.544934 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.547320 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.547298 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-11f3c-predictor-serving-cert\""
Apr 24 19:36:22.547448 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.547303 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-11f3c-kube-rbac-proxy-sar-config\""
Apr 24 19:36:22.554523 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.554496 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"]
Apr 24 19:36:22.555591 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.555569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2976h\" (UniqueName: \"kubernetes.io/projected/f3f0c258-12af-4ace-a54c-9429480a099a-kube-api-access-2976h\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.555676 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.555599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8cq7\" (UniqueName:
\"kubernetes.io/projected/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-kube-api-access-b8cq7\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.555676 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.555641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3f0c258-12af-4ace-a54c-9429480a099a-error-404-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.555676 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.555666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-proxy-tls\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.555806 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.555726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.555806 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.555748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName:
\"kubernetes.io/configmap/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-success-200-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.656382 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.656350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2976h\" (UniqueName: \"kubernetes.io/projected/f3f0c258-12af-4ace-a54c-9429480a099a-kube-api-access-2976h\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.656560 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.656386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8cq7\" (UniqueName: \"kubernetes.io/projected/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-kube-api-access-b8cq7\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.656560 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.656453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3f0c258-12af-4ace-a54c-9429480a099a-error-404-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.656560 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.656481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName:
\"kubernetes.io/secret/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-proxy-tls\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.656690 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.656570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.656690 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.656619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-success-200-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.656690 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:36:22.656669 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-serving-cert: secret "error-404-isvc-11f3c-predictor-serving-cert" not found
Apr 24 19:36:22.656805 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:36:22.656740 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls podName:f3f0c258-12af-4ace-a54c-9429480a099a nodeName:}" failed. No retries permitted until 2026-04-24 19:36:23.156711011 +0000 UTC m=+1820.274051391 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls") pod "error-404-isvc-11f3c-predictor-5b494597f6-5r9st" (UID: "f3f0c258-12af-4ace-a54c-9429480a099a") : secret "error-404-isvc-11f3c-predictor-serving-cert" not found
Apr 24 19:36:22.657265 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.657241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-success-200-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.657499 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.657480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3f0c258-12af-4ace-a54c-9429480a099a-error-404-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.659105 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.659088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-proxy-tls\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.670160 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.670088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2976h\" (UniqueName:
\"kubernetes.io/projected/f3f0c258-12af-4ace-a54c-9429480a099a-kube-api-access-2976h\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:22.671200 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.671180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8cq7\" (UniqueName: \"kubernetes.io/projected/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-kube-api-access-b8cq7\") pod \"success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.743175 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.743142 2573 generic.go:358] "Generic (PLEG): container finished" podID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerID="fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd" exitCode=2
Apr 24 19:36:22.743327 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.743217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" event={"ID":"597f7cde-771c-4421-a1c8-c72bafa6833b","Type":"ContainerDied","Data":"fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd"}
Apr 24 19:36:22.744734 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.744711 2573 generic.go:358] "Generic (PLEG): container finished" podID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerID="9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471" exitCode=2
Apr 24 19:36:22.744848 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.744742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" event={"ID":"d84ecd56-c38d-41d2-9cc4-a0177cc734a1","Type":"ContainerDied","Data":"9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471"}
Apr 24
19:36:22.785327 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.785293 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"
Apr 24 19:36:22.909323 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:22.909299 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"]
Apr 24 19:36:22.911687 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:36:22.911659 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6093af_e5c9_4707_9146_0e5c8e6b48b3.slice/crio-98aa4b6baf312ba131fd8311eb80f91dd070976d408387afc725a35b1a96b4c4 WatchSource:0}: Error finding container 98aa4b6baf312ba131fd8311eb80f91dd070976d408387afc725a35b1a96b4c4: Status 404 returned error can't find the container with id 98aa4b6baf312ba131fd8311eb80f91dd070976d408387afc725a35b1a96b4c4
Apr 24 19:36:23.160187 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.160147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:23.162837 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.162732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls\") pod \"error-404-isvc-11f3c-predictor-5b494597f6-5r9st\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:23.455605 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.455573 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"
Apr 24 19:36:23.582615 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.582568 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"]
Apr 24 19:36:23.584868 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:36:23.584837 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3f0c258_12af_4ace_a54c_9429480a099a.slice/crio-386115da2209b40ef72fca12ed45757672dce541afa767bc4bd4c818065a24e2 WatchSource:0}: Error finding container 386115da2209b40ef72fca12ed45757672dce541afa767bc4bd4c818065a24e2: Status 404 returned error can't find the container with id 386115da2209b40ef72fca12ed45757672dce541afa767bc4bd4c818065a24e2
Apr 24 19:36:23.749444 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.749406 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" event={"ID":"f3f0c258-12af-4ace-a54c-9429480a099a","Type":"ContainerStarted","Data":"7dc60f40b9583a30e573238afb9270f2faf028c8306197a2b6739567c47fa31e"}
Apr 24 19:36:23.749624 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.749476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" event={"ID":"f3f0c258-12af-4ace-a54c-9429480a099a","Type":"ContainerStarted","Data":"d6d9920bf3ec5c85de843fa408a97316f1d9484d9351b4da09e410af41ca3ef2"}
Apr 24 19:36:23.749624 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.749492 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" event={"ID":"f3f0c258-12af-4ace-a54c-9429480a099a","Type":"ContainerStarted","Data":"386115da2209b40ef72fca12ed45757672dce541afa767bc4bd4c818065a24e2"}
Apr 24 19:36:23.749624 ip-10-0-138-6
kubenswrapper[2573]: I0424 19:36:23.749592 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" Apr 24 19:36:23.750993 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.750969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" event={"ID":"dc6093af-e5c9-4707-9146-0e5c8e6b48b3","Type":"ContainerStarted","Data":"2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c"} Apr 24 19:36:23.751108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.750998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" event={"ID":"dc6093af-e5c9-4707-9146-0e5c8e6b48b3","Type":"ContainerStarted","Data":"001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4"} Apr 24 19:36:23.751108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.751007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" event={"ID":"dc6093af-e5c9-4707-9146-0e5c8e6b48b3","Type":"ContainerStarted","Data":"98aa4b6baf312ba131fd8311eb80f91dd070976d408387afc725a35b1a96b4c4"} Apr 24 19:36:23.751229 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.751217 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" Apr 24 19:36:23.751266 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.751234 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" Apr 24 19:36:23.752344 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.752322 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:36:23.771458 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.768836 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podStartSLOduration=1.7688198800000001 podStartE2EDuration="1.76881988s" podCreationTimestamp="2026-04-24 19:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:36:23.767858137 +0000 UTC m=+1820.885198527" watchObservedRunningTime="2026-04-24 19:36:23.76881988 +0000 UTC m=+1820.886160269" Apr 24 19:36:23.788349 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:23.788294 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podStartSLOduration=1.7882766939999999 podStartE2EDuration="1.788276694s" podCreationTimestamp="2026-04-24 19:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:36:23.788018697 +0000 UTC m=+1820.905359087" watchObservedRunningTime="2026-04-24 19:36:23.788276694 +0000 UTC m=+1820.905617101" Apr 24 19:36:24.515071 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:24.515024 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 24 19:36:24.515474 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:24.515040 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" 
podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 24 19:36:24.753682 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:24.753649 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" Apr 24 19:36:24.753919 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:24.753886 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:36:24.755004 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:24.754979 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 19:36:25.757535 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:25.757493 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 19:36:26.056179 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.056154 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:36:26.084873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.084845 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt66b\" (UniqueName: \"kubernetes.io/projected/597f7cde-771c-4421-a1c8-c72bafa6833b-kube-api-access-lt66b\") pod \"597f7cde-771c-4421-a1c8-c72bafa6833b\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " Apr 24 19:36:26.085031 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.084889 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/597f7cde-771c-4421-a1c8-c72bafa6833b-proxy-tls\") pod \"597f7cde-771c-4421-a1c8-c72bafa6833b\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " Apr 24 19:36:26.085031 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.084918 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/597f7cde-771c-4421-a1c8-c72bafa6833b-success-200-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"597f7cde-771c-4421-a1c8-c72bafa6833b\" (UID: \"597f7cde-771c-4421-a1c8-c72bafa6833b\") " Apr 24 19:36:26.085329 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.085304 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597f7cde-771c-4421-a1c8-c72bafa6833b-success-200-isvc-8e956-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-8e956-kube-rbac-proxy-sar-config") pod "597f7cde-771c-4421-a1c8-c72bafa6833b" (UID: "597f7cde-771c-4421-a1c8-c72bafa6833b"). InnerVolumeSpecName "success-200-isvc-8e956-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:36:26.087206 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.087183 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597f7cde-771c-4421-a1c8-c72bafa6833b-kube-api-access-lt66b" (OuterVolumeSpecName: "kube-api-access-lt66b") pod "597f7cde-771c-4421-a1c8-c72bafa6833b" (UID: "597f7cde-771c-4421-a1c8-c72bafa6833b"). InnerVolumeSpecName "kube-api-access-lt66b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:36:26.087308 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.087218 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597f7cde-771c-4421-a1c8-c72bafa6833b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "597f7cde-771c-4421-a1c8-c72bafa6833b" (UID: "597f7cde-771c-4421-a1c8-c72bafa6833b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:36:26.186197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.186160 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lt66b\" (UniqueName: \"kubernetes.io/projected/597f7cde-771c-4421-a1c8-c72bafa6833b-kube-api-access-lt66b\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:36:26.186197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.186191 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/597f7cde-771c-4421-a1c8-c72bafa6833b-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:36:26.186197 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.186202 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/597f7cde-771c-4421-a1c8-c72bafa6833b-success-200-isvc-8e956-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:36:26.515124 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.515099 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:36:26.589806 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.589774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls\") pod \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " Apr 24 19:36:26.589969 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.589824 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-error-404-isvc-8e956-kube-rbac-proxy-sar-config\") pod \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " Apr 24 19:36:26.589969 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.589885 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wjcg\" (UniqueName: \"kubernetes.io/projected/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-kube-api-access-6wjcg\") pod \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\" (UID: \"d84ecd56-c38d-41d2-9cc4-a0177cc734a1\") " Apr 24 19:36:26.590241 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.590210 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-error-404-isvc-8e956-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-8e956-kube-rbac-proxy-sar-config") pod "d84ecd56-c38d-41d2-9cc4-a0177cc734a1" (UID: "d84ecd56-c38d-41d2-9cc4-a0177cc734a1"). InnerVolumeSpecName "error-404-isvc-8e956-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:36:26.592063 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.592045 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d84ecd56-c38d-41d2-9cc4-a0177cc734a1" (UID: "d84ecd56-c38d-41d2-9cc4-a0177cc734a1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:36:26.592134 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.592061 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-kube-api-access-6wjcg" (OuterVolumeSpecName: "kube-api-access-6wjcg") pod "d84ecd56-c38d-41d2-9cc4-a0177cc734a1" (UID: "d84ecd56-c38d-41d2-9cc4-a0177cc734a1"). InnerVolumeSpecName "kube-api-access-6wjcg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:36:26.691012 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.690930 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-8e956-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-error-404-isvc-8e956-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:36:26.691012 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.690961 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6wjcg\" (UniqueName: \"kubernetes.io/projected/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-kube-api-access-6wjcg\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:36:26.691012 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.690971 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d84ecd56-c38d-41d2-9cc4-a0177cc734a1-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:36:26.761299 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.761267 2573 generic.go:358] "Generic (PLEG): container finished" podID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerID="7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92" exitCode=0 Apr 24 19:36:26.761723 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.761342 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" Apr 24 19:36:26.761723 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.761359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" event={"ID":"d84ecd56-c38d-41d2-9cc4-a0177cc734a1","Type":"ContainerDied","Data":"7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92"} Apr 24 19:36:26.761723 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.761394 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt" event={"ID":"d84ecd56-c38d-41d2-9cc4-a0177cc734a1","Type":"ContainerDied","Data":"a567ac9f1356c0c98b8c6475ab523cfcdff347e256aead63f17d4c57483d27a1"} Apr 24 19:36:26.761723 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.761408 2573 scope.go:117] "RemoveContainer" containerID="9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471" Apr 24 19:36:26.762854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.762828 2573 generic.go:358] "Generic (PLEG): container finished" podID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerID="72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325" exitCode=0 Apr 24 19:36:26.763024 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.762879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" 
event={"ID":"597f7cde-771c-4421-a1c8-c72bafa6833b","Type":"ContainerDied","Data":"72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325"} Apr 24 19:36:26.763024 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.762904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" event={"ID":"597f7cde-771c-4421-a1c8-c72bafa6833b","Type":"ContainerDied","Data":"8ff025d2d2e76611a6fa7e1bda0e8358e521fd4031cb3fb36fa0630ec09bd0fb"} Apr 24 19:36:26.763024 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.762990 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn" Apr 24 19:36:26.775179 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.775156 2573 scope.go:117] "RemoveContainer" containerID="7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92" Apr 24 19:36:26.782517 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.782499 2573 scope.go:117] "RemoveContainer" containerID="9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471" Apr 24 19:36:26.782777 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:36:26.782755 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471\": container with ID starting with 9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471 not found: ID does not exist" containerID="9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471" Apr 24 19:36:26.782835 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.782789 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471"} err="failed to get container status \"9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471\": rpc error: code = NotFound 
desc = could not find container \"9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471\": container with ID starting with 9288459272d2870129fb18a36fe34980a46cdbb61bb676206817d12b6ee19471 not found: ID does not exist" Apr 24 19:36:26.782835 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.782812 2573 scope.go:117] "RemoveContainer" containerID="7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92" Apr 24 19:36:26.783039 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:36:26.783019 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92\": container with ID starting with 7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92 not found: ID does not exist" containerID="7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92" Apr 24 19:36:26.783085 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.783046 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92"} err="failed to get container status \"7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92\": rpc error: code = NotFound desc = could not find container \"7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92\": container with ID starting with 7137870406a43f2c86a329e91b354a7ca30cf45a6372f8d4971a9f32f71eeb92 not found: ID does not exist" Apr 24 19:36:26.783085 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.783062 2573 scope.go:117] "RemoveContainer" containerID="fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd" Apr 24 19:36:26.787964 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.787945 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"] Apr 24 19:36:26.790536 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:36:26.790505 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8e956-predictor-579998b9d-qv6bt"] Apr 24 19:36:26.790990 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.790978 2573 scope.go:117] "RemoveContainer" containerID="72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325" Apr 24 19:36:26.797926 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.797912 2573 scope.go:117] "RemoveContainer" containerID="fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd" Apr 24 19:36:26.798135 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:36:26.798120 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd\": container with ID starting with fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd not found: ID does not exist" containerID="fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd" Apr 24 19:36:26.798181 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.798141 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd"} err="failed to get container status \"fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd\": rpc error: code = NotFound desc = could not find container \"fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd\": container with ID starting with fce2516f61dd49f3bd923607886a1f19e94ecf983215547e6ee3ed8cf15df1dd not found: ID does not exist" Apr 24 19:36:26.798181 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.798156 2573 scope.go:117] "RemoveContainer" containerID="72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325" Apr 24 19:36:26.798391 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:36:26.798375 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325\": container with ID starting with 72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325 not found: ID does not exist" containerID="72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325" Apr 24 19:36:26.798465 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.798400 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325"} err="failed to get container status \"72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325\": rpc error: code = NotFound desc = could not find container \"72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325\": container with ID starting with 72757a1ba3c81d248efbe04133493ee08120940af612572fda899a79ca7c5325 not found: ID does not exist" Apr 24 19:36:26.800412 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.800392 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"] Apr 24 19:36:26.804852 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:26.804833 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8e956-predictor-b8b4d68c7-s2pwn"] Apr 24 19:36:27.379857 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:27.379826 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" path="/var/lib/kubelet/pods/597f7cde-771c-4421-a1c8-c72bafa6833b/volumes" Apr 24 19:36:27.380252 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:27.380239 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" path="/var/lib/kubelet/pods/d84ecd56-c38d-41d2-9cc4-a0177cc734a1/volumes" Apr 24 19:36:28.616131 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:28.616090 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 19:36:29.620454 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:29.620405 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:36:29.758643 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:29.758617 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" Apr 24 19:36:29.759049 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:29.759023 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:36:30.762137 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:30.762106 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" Apr 24 19:36:30.762668 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:30.762637 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 19:36:38.616881 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:38.616853 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:36:39.759162 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:36:39.759125 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:36:40.763121 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:40.763080 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 19:36:49.759361 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:49.759325 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:36:50.763494 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:50.763449 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 19:36:59.759688 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:36:59.759651 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:37:00.762814 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:37:00.762772 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 19:37:09.760062 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:37:09.760034 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" Apr 24 19:37:10.763108 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:37:10.763073 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" Apr 24 19:45:37.228714 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:37.228686 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"] Apr 24 19:45:37.231157 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:37.228941 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" containerID="cri-o://001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4" gracePeriod=30 Apr 24 19:45:37.231157 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:37.229000 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kube-rbac-proxy" containerID="cri-o://2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c" gracePeriod=30 Apr 24 19:45:37.266973 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:37.266946 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"] Apr 24 19:45:37.267241 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:45:37.267202 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container" containerID="cri-o://d6d9920bf3ec5c85de843fa408a97316f1d9484d9351b4da09e410af41ca3ef2" gracePeriod=30 Apr 24 19:45:37.267314 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:37.267237 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kube-rbac-proxy" containerID="cri-o://7dc60f40b9583a30e573238afb9270f2faf028c8306197a2b6739567c47fa31e" gracePeriod=30 Apr 24 19:45:38.300038 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:38.300002 2573 generic.go:358] "Generic (PLEG): container finished" podID="f3f0c258-12af-4ace-a54c-9429480a099a" containerID="7dc60f40b9583a30e573238afb9270f2faf028c8306197a2b6739567c47fa31e" exitCode=2 Apr 24 19:45:38.300530 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:38.300076 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" event={"ID":"f3f0c258-12af-4ace-a54c-9429480a099a","Type":"ContainerDied","Data":"7dc60f40b9583a30e573238afb9270f2faf028c8306197a2b6739567c47fa31e"} Apr 24 19:45:38.301625 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:38.301601 2573 generic.go:358] "Generic (PLEG): container finished" podID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerID="2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c" exitCode=2 Apr 24 19:45:38.301736 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:38.301661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" 
event={"ID":"dc6093af-e5c9-4707-9146-0e5c8e6b48b3","Type":"ContainerDied","Data":"2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c"} Apr 24 19:45:39.754202 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:39.754158 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 24 19:45:39.759521 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:39.759496 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 19:45:40.309178 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.309147 2573 generic.go:358] "Generic (PLEG): container finished" podID="f3f0c258-12af-4ace-a54c-9429480a099a" containerID="d6d9920bf3ec5c85de843fa408a97316f1d9484d9351b4da09e410af41ca3ef2" exitCode=0 Apr 24 19:45:40.309361 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.309224 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" event={"ID":"f3f0c258-12af-4ace-a54c-9429480a099a","Type":"ContainerDied","Data":"d6d9920bf3ec5c85de843fa408a97316f1d9484d9351b4da09e410af41ca3ef2"} Apr 24 19:45:40.347507 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.347482 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" Apr 24 19:45:40.475772 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.475749 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" Apr 24 19:45:40.538997 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.538959 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3f0c258-12af-4ace-a54c-9429480a099a-error-404-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"f3f0c258-12af-4ace-a54c-9429480a099a\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " Apr 24 19:45:40.539173 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.539028 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls\") pod \"f3f0c258-12af-4ace-a54c-9429480a099a\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " Apr 24 19:45:40.539227 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.539190 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2976h\" (UniqueName: \"kubernetes.io/projected/f3f0c258-12af-4ace-a54c-9429480a099a-kube-api-access-2976h\") pod \"f3f0c258-12af-4ace-a54c-9429480a099a\" (UID: \"f3f0c258-12af-4ace-a54c-9429480a099a\") " Apr 24 19:45:40.539388 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.539361 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f0c258-12af-4ace-a54c-9429480a099a-error-404-isvc-11f3c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-11f3c-kube-rbac-proxy-sar-config") pod "f3f0c258-12af-4ace-a54c-9429480a099a" (UID: "f3f0c258-12af-4ace-a54c-9429480a099a"). InnerVolumeSpecName "error-404-isvc-11f3c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:45:40.541287 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.541263 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f3f0c258-12af-4ace-a54c-9429480a099a" (UID: "f3f0c258-12af-4ace-a54c-9429480a099a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:45:40.541372 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.541296 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f0c258-12af-4ace-a54c-9429480a099a-kube-api-access-2976h" (OuterVolumeSpecName: "kube-api-access-2976h") pod "f3f0c258-12af-4ace-a54c-9429480a099a" (UID: "f3f0c258-12af-4ace-a54c-9429480a099a"). InnerVolumeSpecName "kube-api-access-2976h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:45:40.640680 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.640631 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8cq7\" (UniqueName: \"kubernetes.io/projected/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-kube-api-access-b8cq7\") pod \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " Apr 24 19:45:40.640680 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.640688 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-proxy-tls\") pod \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " Apr 24 19:45:40.640863 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.640787 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-success-200-isvc-11f3c-kube-rbac-proxy-sar-config\") pod \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\" (UID: \"dc6093af-e5c9-4707-9146-0e5c8e6b48b3\") " Apr 24 19:45:40.641010 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.640994 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2976h\" (UniqueName: \"kubernetes.io/projected/f3f0c258-12af-4ace-a54c-9429480a099a-kube-api-access-2976h\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:45:40.641053 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.641017 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f3f0c258-12af-4ace-a54c-9429480a099a-error-404-isvc-11f3c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:45:40.641053 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.641033 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3f0c258-12af-4ace-a54c-9429480a099a-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:45:40.641149 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.641124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-success-200-isvc-11f3c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-11f3c-kube-rbac-proxy-sar-config") pod "dc6093af-e5c9-4707-9146-0e5c8e6b48b3" (UID: "dc6093af-e5c9-4707-9146-0e5c8e6b48b3"). InnerVolumeSpecName "success-200-isvc-11f3c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:45:40.642970 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.642945 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dc6093af-e5c9-4707-9146-0e5c8e6b48b3" (UID: "dc6093af-e5c9-4707-9146-0e5c8e6b48b3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:45:40.643049 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.643028 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-kube-api-access-b8cq7" (OuterVolumeSpecName: "kube-api-access-b8cq7") pod "dc6093af-e5c9-4707-9146-0e5c8e6b48b3" (UID: "dc6093af-e5c9-4707-9146-0e5c8e6b48b3"). InnerVolumeSpecName "kube-api-access-b8cq7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:45:40.741776 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.741682 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-11f3c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-success-200-isvc-11f3c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:45:40.741776 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.741716 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b8cq7\" (UniqueName: \"kubernetes.io/projected/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-kube-api-access-b8cq7\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:45:40.741776 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:40.741726 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6093af-e5c9-4707-9146-0e5c8e6b48b3-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:45:41.313634 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.313607 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" Apr 24 19:45:41.314077 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.313605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st" event={"ID":"f3f0c258-12af-4ace-a54c-9429480a099a","Type":"ContainerDied","Data":"386115da2209b40ef72fca12ed45757672dce541afa767bc4bd4c818065a24e2"} Apr 24 19:45:41.314077 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.313739 2573 scope.go:117] "RemoveContainer" containerID="7dc60f40b9583a30e573238afb9270f2faf028c8306197a2b6739567c47fa31e" Apr 24 19:45:41.315110 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.315065 2573 generic.go:358] "Generic (PLEG): container finished" podID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerID="001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4" exitCode=0 Apr 24 19:45:41.315110 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.315094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" event={"ID":"dc6093af-e5c9-4707-9146-0e5c8e6b48b3","Type":"ContainerDied","Data":"001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4"} Apr 24 19:45:41.315266 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.315127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" event={"ID":"dc6093af-e5c9-4707-9146-0e5c8e6b48b3","Type":"ContainerDied","Data":"98aa4b6baf312ba131fd8311eb80f91dd070976d408387afc725a35b1a96b4c4"} Apr 24 19:45:41.315266 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.315149 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm" Apr 24 19:45:41.322344 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.322310 2573 scope.go:117] "RemoveContainer" containerID="d6d9920bf3ec5c85de843fa408a97316f1d9484d9351b4da09e410af41ca3ef2" Apr 24 19:45:41.329830 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.329815 2573 scope.go:117] "RemoveContainer" containerID="2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c" Apr 24 19:45:41.337079 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.337063 2573 scope.go:117] "RemoveContainer" containerID="001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4" Apr 24 19:45:41.338115 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.338098 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"] Apr 24 19:45:41.342625 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.342606 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-11f3c-predictor-b9f5b46d7-btmfm"] Apr 24 19:45:41.344824 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.344808 2573 scope.go:117] "RemoveContainer" containerID="2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c" Apr 24 19:45:41.345081 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:45:41.345054 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c\": container with ID starting with 2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c not found: ID does not exist" containerID="2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c" Apr 24 19:45:41.345172 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.345084 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c"} err="failed to get container status \"2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c\": rpc error: code = NotFound desc = could not find container \"2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c\": container with ID starting with 2e54145206b68e77d9fddc2c75e9ca693e35fb79537429440fb6e9df7425ac0c not found: ID does not exist" Apr 24 19:45:41.345172 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.345102 2573 scope.go:117] "RemoveContainer" containerID="001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4" Apr 24 19:45:41.345328 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:45:41.345312 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4\": container with ID starting with 001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4 not found: ID does not exist" containerID="001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4" Apr 24 19:45:41.345381 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.345332 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4"} err="failed to get container status \"001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4\": rpc error: code = NotFound desc = could not find container \"001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4\": container with ID starting with 001fdbd033385d168824d7fc7fc48f8aac36dcb127911abeb7dc4d02db7d21d4 not found: ID does not exist" Apr 24 19:45:41.353102 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.353080 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"] Apr 24 19:45:41.355156 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:45:41.355137 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-11f3c-predictor-5b494597f6-5r9st"] Apr 24 19:45:41.379695 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.379669 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" path="/var/lib/kubelet/pods/dc6093af-e5c9-4707-9146-0e5c8e6b48b3/volumes" Apr 24 19:45:41.380070 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:45:41.380058 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" path="/var/lib/kubelet/pods/f3f0c258-12af-4ace-a54c-9429480a099a/volumes" Apr 24 19:53:11.482782 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:11.482745 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj"] Apr 24 19:53:11.485105 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:11.483040 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container" containerID="cri-o://1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00" gracePeriod=30 Apr 24 19:53:11.485105 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:11.483063 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kube-rbac-proxy" containerID="cri-o://a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c" gracePeriod=30 Apr 24 19:53:11.530653 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:11.530624 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"] Apr 24 19:53:11.530917 ip-10-0-138-6 
kubenswrapper[2573]: I0424 19:53:11.530884 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container" containerID="cri-o://da9331c428514f84e336cd605a2abb673834364cb157a3cde45107a6f3049fb0" gracePeriod=30 Apr 24 19:53:11.531046 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:11.530938 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kube-rbac-proxy" containerID="cri-o://357efcd04511f0ef09f6c6d898862eb9aa6be58c8ad5efb670de7370dd287401" gracePeriod=30 Apr 24 19:53:12.554257 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:12.554221 2573 generic.go:358] "Generic (PLEG): container finished" podID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerID="a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c" exitCode=2 Apr 24 19:53:12.554719 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:12.554288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" event={"ID":"39c2c5c2-4106-44f1-9132-cc5e3065e04a","Type":"ContainerDied","Data":"a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c"} Apr 24 19:53:12.555725 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:12.555701 2573 generic.go:358] "Generic (PLEG): container finished" podID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerID="357efcd04511f0ef09f6c6d898862eb9aa6be58c8ad5efb670de7370dd287401" exitCode=2 Apr 24 19:53:12.555833 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:12.555740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" 
event={"ID":"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80","Type":"ContainerDied","Data":"357efcd04511f0ef09f6c6d898862eb9aa6be58c8ad5efb670de7370dd287401"} Apr 24 19:53:13.611704 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:13.611656 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 24 19:53:14.434789 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.434768 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:53:14.521340 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.521267 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptp54\" (UniqueName: \"kubernetes.io/projected/39c2c5c2-4106-44f1-9132-cc5e3065e04a-kube-api-access-ptp54\") pod \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " Apr 24 19:53:14.521340 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.521329 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c2c5c2-4106-44f1-9132-cc5e3065e04a-proxy-tls\") pod \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " Apr 24 19:53:14.521566 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.521363 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39c2c5c2-4106-44f1-9132-cc5e3065e04a-success-200-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\" (UID: \"39c2c5c2-4106-44f1-9132-cc5e3065e04a\") " Apr 24 
19:53:14.521721 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.521689 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c2c5c2-4106-44f1-9132-cc5e3065e04a-success-200-isvc-a0351-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a0351-kube-rbac-proxy-sar-config") pod "39c2c5c2-4106-44f1-9132-cc5e3065e04a" (UID: "39c2c5c2-4106-44f1-9132-cc5e3065e04a"). InnerVolumeSpecName "success-200-isvc-a0351-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:53:14.523328 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.523307 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c2c5c2-4106-44f1-9132-cc5e3065e04a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "39c2c5c2-4106-44f1-9132-cc5e3065e04a" (UID: "39c2c5c2-4106-44f1-9132-cc5e3065e04a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:53:14.523462 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.523417 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c2c5c2-4106-44f1-9132-cc5e3065e04a-kube-api-access-ptp54" (OuterVolumeSpecName: "kube-api-access-ptp54") pod "39c2c5c2-4106-44f1-9132-cc5e3065e04a" (UID: "39c2c5c2-4106-44f1-9132-cc5e3065e04a"). InnerVolumeSpecName "kube-api-access-ptp54". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:53:14.567578 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.567549 2573 generic.go:358] "Generic (PLEG): container finished" podID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerID="1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00" exitCode=0 Apr 24 19:53:14.567707 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.567613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" event={"ID":"39c2c5c2-4106-44f1-9132-cc5e3065e04a","Type":"ContainerDied","Data":"1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00"} Apr 24 19:53:14.567707 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.567637 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" event={"ID":"39c2c5c2-4106-44f1-9132-cc5e3065e04a","Type":"ContainerDied","Data":"182a28698a7c976b89e7bff0e20c88636aeeee964a02a2aade81c47b6f04558e"} Apr 24 19:53:14.567707 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.567657 2573 scope.go:117] "RemoveContainer" containerID="a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c" Apr 24 19:53:14.567707 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.567659 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj" Apr 24 19:53:14.569292 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.569266 2573 generic.go:358] "Generic (PLEG): container finished" podID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerID="da9331c428514f84e336cd605a2abb673834364cb157a3cde45107a6f3049fb0" exitCode=0 Apr 24 19:53:14.569404 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.569301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" event={"ID":"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80","Type":"ContainerDied","Data":"da9331c428514f84e336cd605a2abb673834364cb157a3cde45107a6f3049fb0"} Apr 24 19:53:14.575617 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.575599 2573 scope.go:117] "RemoveContainer" containerID="1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00" Apr 24 19:53:14.582187 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.582170 2573 scope.go:117] "RemoveContainer" containerID="a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c" Apr 24 19:53:14.582445 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:53:14.582412 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c\": container with ID starting with a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c not found: ID does not exist" containerID="a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c" Apr 24 19:53:14.582504 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.582455 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c"} err="failed to get container status \"a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c\": rpc error: code = NotFound desc = could 
not find container \"a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c\": container with ID starting with a8b1f5d1584ccbd6ebc269350b7ad886a7f94458f95090e247dedbec478cdb3c not found: ID does not exist" Apr 24 19:53:14.582504 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.582472 2573 scope.go:117] "RemoveContainer" containerID="1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00" Apr 24 19:53:14.582693 ip-10-0-138-6 kubenswrapper[2573]: E0424 19:53:14.582674 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00\": container with ID starting with 1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00 not found: ID does not exist" containerID="1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00" Apr 24 19:53:14.582749 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.582701 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00"} err="failed to get container status \"1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00\": rpc error: code = NotFound desc = could not find container \"1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00\": container with ID starting with 1423ef703815f3f51d3cf071f589247cfb4bd33a6db5eba2397e17cf9342bd00 not found: ID does not exist" Apr 24 19:53:14.587831 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.587812 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj"] Apr 24 19:53:14.589816 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.589797 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a0351-predictor-575fcb594c-rxptj"] Apr 24 19:53:14.615807 ip-10-0-138-6 kubenswrapper[2573]: I0424 
19:53:14.615783 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 24 19:53:14.622629 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.622614 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39c2c5c2-4106-44f1-9132-cc5e3065e04a-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:53:14.622684 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.622634 2573 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39c2c5c2-4106-44f1-9132-cc5e3065e04a-success-200-isvc-a0351-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:53:14.622684 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.622644 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptp54\" (UniqueName: \"kubernetes.io/projected/39c2c5c2-4106-44f1-9132-cc5e3065e04a-kube-api-access-ptp54\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:53:14.959659 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:14.959638 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:53:15.025704 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.025678 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls\") pod \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " Apr 24 19:53:15.025873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.025724 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44w65\" (UniqueName: \"kubernetes.io/projected/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-kube-api-access-44w65\") pod \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " Apr 24 19:53:15.025873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.025752 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-error-404-isvc-a0351-kube-rbac-proxy-sar-config\") pod \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\" (UID: \"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80\") " Apr 24 19:53:15.026163 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.026130 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-error-404-isvc-a0351-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a0351-kube-rbac-proxy-sar-config") pod "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" (UID: "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80"). InnerVolumeSpecName "error-404-isvc-a0351-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:53:15.027876 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.027856 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" (UID: "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:53:15.027970 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.027928 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-kube-api-access-44w65" (OuterVolumeSpecName: "kube-api-access-44w65") pod "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" (UID: "76541b04-b9fb-4b4d-b9c6-e9487f3a1b80"). InnerVolumeSpecName "kube-api-access-44w65". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:53:15.126472 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.126370 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-proxy-tls\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:53:15.126472 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.126398 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-44w65\" (UniqueName: \"kubernetes.io/projected/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-kube-api-access-44w65\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:53:15.126472 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.126413 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a0351-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80-error-404-isvc-a0351-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-6.ec2.internal\" DevicePath \"\"" Apr 24 19:53:15.380024 
ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.379949 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" path="/var/lib/kubelet/pods/39c2c5c2-4106-44f1-9132-cc5e3065e04a/volumes" Apr 24 19:53:15.573756 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.573723 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" event={"ID":"76541b04-b9fb-4b4d-b9c6-e9487f3a1b80","Type":"ContainerDied","Data":"3e0de82566acdd2e9c3fbfd726ce203a78f1c57af3856009c727c236bf4396fe"} Apr 24 19:53:15.573930 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.573766 2573 scope.go:117] "RemoveContainer" containerID="357efcd04511f0ef09f6c6d898862eb9aa6be58c8ad5efb670de7370dd287401" Apr 24 19:53:15.573930 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.573776 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc" Apr 24 19:53:15.581273 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.581255 2573 scope.go:117] "RemoveContainer" containerID="da9331c428514f84e336cd605a2abb673834364cb157a3cde45107a6f3049fb0" Apr 24 19:53:15.589562 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.589540 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"] Apr 24 19:53:15.595318 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:15.595297 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a0351-predictor-5d4c5bbb76-bj4lc"] Apr 24 19:53:17.379445 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:17.379408 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" path="/var/lib/kubelet/pods/76541b04-b9fb-4b4d-b9c6-e9487f3a1b80/volumes" Apr 24 19:53:37.133484 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133448 2573 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-clnxb/must-gather-9b8nr"]
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133695 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133705 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133715 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133721 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133728 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133734 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133741 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133745 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133751 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133756 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133765 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133770 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133776 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133780 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133788 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133793 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133798 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133803
2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133813 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133818 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133823 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133829 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133836 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133841 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133880 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133889 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kube-rbac-proxy"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133896 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kserve-container"
Apr 24 19:53:37.133890 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133902 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kserve-container"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133909 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kube-rbac-proxy"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133915 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d84ecd56-c38d-41d2-9cc4-a0177cc734a1" containerName="kube-rbac-proxy"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133921 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kube-rbac-proxy"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133926 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc6093af-e5c9-4707-9146-0e5c8e6b48b3" containerName="kserve-container"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133932 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="597f7cde-771c-4421-a1c8-c72bafa6833b" containerName="kserve-container"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133937 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3f0c258-12af-4ace-a54c-9429480a099a" containerName="kserve-container"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133943 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="39c2c5c2-4106-44f1-9132-cc5e3065e04a" containerName="kube-rbac-proxy"
Apr 24 19:53:37.134873 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.133948 2573
memory_manager.go:356] "RemoveStaleState removing state" podUID="76541b04-b9fb-4b4d-b9c6-e9487f3a1b80" containerName="kserve-container"
Apr 24 19:53:37.137004 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.136987 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.140561 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.140539 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-clnxb\"/\"kube-root-ca.crt\""
Apr 24 19:53:37.140696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.140585 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-clnxb\"/\"openshift-service-ca.crt\""
Apr 24 19:53:37.140696 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.140599 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-clnxb\"/\"default-dockercfg-jx8fh\""
Apr 24 19:53:37.143541 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.143509 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-clnxb/must-gather-9b8nr"]
Apr 24 19:53:37.185609 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.185576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szc55\" (UniqueName: \"kubernetes.io/projected/a0bf6186-5fbc-4562-b3b3-f2689c3489c0-kube-api-access-szc55\") pod \"must-gather-9b8nr\" (UID: \"a0bf6186-5fbc-4562-b3b3-f2689c3489c0\") " pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.185776 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.185621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0bf6186-5fbc-4562-b3b3-f2689c3489c0-must-gather-output\") pod \"must-gather-9b8nr\" (UID: \"a0bf6186-5fbc-4562-b3b3-f2689c3489c0\") " pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.285953 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.285920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szc55\" (UniqueName: \"kubernetes.io/projected/a0bf6186-5fbc-4562-b3b3-f2689c3489c0-kube-api-access-szc55\") pod \"must-gather-9b8nr\" (UID: \"a0bf6186-5fbc-4562-b3b3-f2689c3489c0\") " pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.286117 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.285968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0bf6186-5fbc-4562-b3b3-f2689c3489c0-must-gather-output\") pod \"must-gather-9b8nr\" (UID: \"a0bf6186-5fbc-4562-b3b3-f2689c3489c0\") " pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.286243 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.286229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0bf6186-5fbc-4562-b3b3-f2689c3489c0-must-gather-output\") pod \"must-gather-9b8nr\" (UID: \"a0bf6186-5fbc-4562-b3b3-f2689c3489c0\") " pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.295638 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.295613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szc55\" (UniqueName: \"kubernetes.io/projected/a0bf6186-5fbc-4562-b3b3-f2689c3489c0-kube-api-access-szc55\") pod \"must-gather-9b8nr\" (UID: \"a0bf6186-5fbc-4562-b3b3-f2689c3489c0\") " pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.447047 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.446947 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-clnxb/must-gather-9b8nr"
Apr 24 19:53:37.570100 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.570070 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-clnxb/must-gather-9b8nr"]
Apr 24 19:53:37.572352 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:53:37.572326 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bf6186_5fbc_4562_b3b3_f2689c3489c0.slice/crio-d3b48de5cabb3689f9391c9aa2a5d555c1208a62a949e03aefb2e1dbcff726d0 WatchSource:0}: Error finding container d3b48de5cabb3689f9391c9aa2a5d555c1208a62a949e03aefb2e1dbcff726d0: Status 404 returned error can't find the container with id d3b48de5cabb3689f9391c9aa2a5d555c1208a62a949e03aefb2e1dbcff726d0
Apr 24 19:53:37.574055 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.574037 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:53:37.641936 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:37.641896 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clnxb/must-gather-9b8nr" event={"ID":"a0bf6186-5fbc-4562-b3b3-f2689c3489c0","Type":"ContainerStarted","Data":"d3b48de5cabb3689f9391c9aa2a5d555c1208a62a949e03aefb2e1dbcff726d0"}
Apr 24 19:53:38.650214 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:38.650128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clnxb/must-gather-9b8nr" event={"ID":"a0bf6186-5fbc-4562-b3b3-f2689c3489c0","Type":"ContainerStarted","Data":"b908704b8efa0be74e1d9c33a7efa0703c57f03b3f3d1bbe9d2a714532f68544"}
Apr 24 19:53:39.655819 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:39.655776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clnxb/must-gather-9b8nr" event={"ID":"a0bf6186-5fbc-4562-b3b3-f2689c3489c0","Type":"ContainerStarted","Data":"2e7d8aab597090c16e439ed1e789ea7838357d28f877a03abe8e9e4fb3ec4fe8"}
Apr 24 19:53:39.672496 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:39.672120 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-clnxb/must-gather-9b8nr" podStartSLOduration=1.849035852 podStartE2EDuration="2.67209911s" podCreationTimestamp="2026-04-24 19:53:37 +0000 UTC" firstStartedPulling="2026-04-24 19:53:37.574163261 +0000 UTC m=+2854.691503631" lastFinishedPulling="2026-04-24 19:53:38.397226509 +0000 UTC m=+2855.514566889" observedRunningTime="2026-04-24 19:53:39.670774705 +0000 UTC m=+2856.788115106" watchObservedRunningTime="2026-04-24 19:53:39.67209911 +0000 UTC m=+2856.789439505"
Apr 24 19:53:39.904575 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:39.904530 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wv5cl_7e7fa504-1a33-465d-aa64-5131b6adcc9f/global-pull-secret-syncer/0.log"
Apr 24 19:53:40.017134 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:40.017104 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zz6t5_ca30f8c7-a373-4425-93f6-cfa4c4634150/konnectivity-agent/0.log"
Apr 24 19:53:40.084459 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:40.084416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-6.ec2.internal_946221ebdc75fe55155252abed2eec40/haproxy/0.log"
Apr 24 19:53:43.761271 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:43.761240 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-77d8db7bb6-r2jtz_84edfa12-7053-4df2-8b08-9c1938baf06c/metrics-server/0.log"
Apr 24 19:53:43.945058 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:43.945030 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_node-exporter-sg2th_3989dc21-ce63-4e0b-b43a-94257a2b6be9/node-exporter/0.log"
Apr 24 19:53:43.964062 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:43.964036 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sg2th_3989dc21-ce63-4e0b-b43a-94257a2b6be9/kube-rbac-proxy/0.log"
Apr 24 19:53:43.986865 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:43.986837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sg2th_3989dc21-ce63-4e0b-b43a-94257a2b6be9/init-textfile/0.log"
Apr 24 19:53:45.763296 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:45.763268 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-bnvtx_9ab7016b-4eb4-436f-945f-9e7f777cdd5a/networking-console-plugin/0.log"
Apr 24 19:53:46.631762 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.631734 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-jwr7b_96655824-83e2-40dd-9a26-dc820359dbe0/download-server/0.log"
Apr 24 19:53:46.719133 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.719102 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"]
Apr 24 19:53:46.723573 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.723550 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.728868 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.728837 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"]
Apr 24 19:53:46.767123 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.767091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2djdf\" (UniqueName: \"kubernetes.io/projected/af567ab2-f627-4c6f-9834-0b0c67aa09a3-kube-api-access-2djdf\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.767627 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.767134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-sys\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.767627 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.767197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-proc\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.767627 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.767308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-lib-modules\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") "
pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.767627 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.767350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-podres\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868623 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-lib-modules\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868623 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-podres\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-podres\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2djdf\" (UniqueName: \"kubernetes.io/projected/af567ab2-f627-4c6f-9834-0b0c67aa09a3-kube-api-access-2djdf\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-lib-modules\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-sys\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.868866 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-proc\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.869039 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-proc\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.869039 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.868913 2573
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af567ab2-f627-4c6f-9834-0b0c67aa09a3-sys\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:46.877791 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:46.877770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2djdf\" (UniqueName: \"kubernetes.io/projected/af567ab2-f627-4c6f-9834-0b0c67aa09a3-kube-api-access-2djdf\") pod \"perf-node-gather-daemonset-kkjxt\" (UID: \"af567ab2-f627-4c6f-9834-0b0c67aa09a3\") " pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:47.035077 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.035028 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:47.176901 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.176875 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"]
Apr 24 19:53:47.179393 ip-10-0-138-6 kubenswrapper[2573]: W0424 19:53:47.179365 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf567ab2_f627_4c6f_9834_0b0c67aa09a3.slice/crio-c817c4b7b03766c975d293aec091216f45246dfed90ef32057b15fd4843a68ae WatchSource:0}: Error finding container c817c4b7b03766c975d293aec091216f45246dfed90ef32057b15fd4843a68ae: Status 404 returned error can't find the container with id c817c4b7b03766c975d293aec091216f45246dfed90ef32057b15fd4843a68ae
Apr 24 19:53:47.685835 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.685801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt" event={"ID":"af567ab2-f627-4c6f-9834-0b0c67aa09a3","Type":"ContainerStarted","Data":"e235a74018dbb7fc8f8f66e4da9e3d2cb9b36aa394db1a0963d3bee8bfe6884c"}
Apr 24 19:53:47.685835 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.685841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt" event={"ID":"af567ab2-f627-4c6f-9834-0b0c67aa09a3","Type":"ContainerStarted","Data":"c817c4b7b03766c975d293aec091216f45246dfed90ef32057b15fd4843a68ae"}
Apr 24 19:53:47.686057 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.685923 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:47.702522 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.702481 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt" podStartSLOduration=1.7024669860000001 podStartE2EDuration="1.702466986s" podCreationTimestamp="2026-04-24 19:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:53:47.700576419 +0000 UTC m=+2864.817916821" watchObservedRunningTime="2026-04-24 19:53:47.702466986 +0000 UTC m=+2864.819807373"
Apr 24 19:53:47.730615 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.730591 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6d2pw_68ba56d1-dd32-46c1-8484-b4074baf3f3f/dns/0.log"
Apr 24 19:53:47.748870 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.748850 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6d2pw_68ba56d1-dd32-46c1-8484-b4074baf3f3f/kube-rbac-proxy/0.log"
Apr 24 19:53:47.808106 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:47.808053 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-dns_node-resolver-b5mbc_3ae0b497-fe7c-4446-ba6a-1177ca3da41b/dns-node-resolver/0.log"
Apr 24 19:53:48.263241 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:48.263213 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-75d6c9bd47-mc8md_8e292081-ae2a-40a6-b7fa-6d1463c221f2/registry/0.log"
Apr 24 19:53:48.326474 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:48.326426 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ntjd2_e5bcfed1-92e6-4764-9896-4a9fc77aaef9/node-ca/0.log"
Apr 24 19:53:49.344309 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:49.344280 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5sq4k_eea44eed-90c5-4bbc-b836-55ef49678cf3/serve-healthcheck-canary/0.log"
Apr 24 19:53:49.811743 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:49.811707 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9hxjz_623c1c61-52af-445b-b5cd-8972d473f55d/kube-rbac-proxy/0.log"
Apr 24 19:53:49.859615 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:49.859586 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9hxjz_623c1c61-52af-445b-b5cd-8972d473f55d/exporter/0.log"
Apr 24 19:53:49.880839 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:49.880812 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9hxjz_623c1c61-52af-445b-b5cd-8972d473f55d/extractor/0.log"
Apr 24 19:53:51.982985 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:51.982942 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-9ct4v_128f5402-7d31-4587-a80a-9e48a31b06ca/server/0.log"
Apr 24 19:53:53.704209 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:53.704178 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-clnxb/perf-node-gather-daemonset-kkjxt"
Apr 24 19:53:56.470563 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:56.470523 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sspcx_10b3548c-86e6-44e8-9141-042fa481976e/migrator/0.log"
Apr 24 19:53:56.497801 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:56.497655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sspcx_10b3548c-86e6-44e8-9141-042fa481976e/graceful-termination/0.log"
Apr 24 19:53:58.143846 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.143815 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/kube-multus-additional-cni-plugins/0.log"
Apr 24 19:53:58.168854 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.168784 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/egress-router-binary-copy/0.log"
Apr 24 19:53:58.188414 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.188390 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/cni-plugins/0.log"
Apr 24 19:53:58.213064 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.213045 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/bond-cni-plugin/0.log"
Apr 24 19:53:58.237903 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.237865 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/routeoverride-cni/0.log"
Apr 24 19:53:58.258605 ip-10-0-138-6 kubenswrapper[2573]: I0424
19:53:58.258567 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/whereabouts-cni-bincopy/0.log"
Apr 24 19:53:58.277158 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.277137 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-t4krc_6b161493-342d-489a-a1b5-2d34fb7236d6/whereabouts-cni/0.log"
Apr 24 19:53:58.304793 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.304766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbrbg_cfb7edc3-113a-4b12-83d1-66356304b80c/kube-multus/0.log"
Apr 24 19:53:58.403351 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.403319 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tf94j_b0872aa7-303f-4052-9d68-dd136609293b/network-metrics-daemon/0.log"
Apr 24 19:53:58.420775 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:58.420711 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tf94j_b0872aa7-303f-4052-9d68-dd136609293b/kube-rbac-proxy/0.log"
Apr 24 19:53:59.712270 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.712239 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/ovn-controller/0.log"
Apr 24 19:53:59.744614 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.744588 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/ovn-acl-logging/0.log"
Apr 24 19:53:59.763974 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.763939 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/kube-rbac-proxy-node/0.log"
Apr 24 19:53:59.782850 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.782830 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 19:53:59.802289 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.802270 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/northd/0.log"
Apr 24 19:53:59.820833 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.820811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/nbdb/0.log"
Apr 24 19:53:59.841008 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.840987 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/sbdb/0.log"
Apr 24 19:53:59.972678 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:53:59.972599 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wt2vz_cfb1fffa-6d48-4ac4-ae37-ea4c1839473f/ovnkube-controller/0.log"
Apr 24 19:54:00.990097 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:54:00.990069 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nrb24_f5236623-3273-4733-a194-9bfd58303272/network-check-target-container/0.log"
Apr 24 19:54:01.821167 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:54:01.821138 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8qp2b_0e855347-f4fa-493a-b42a-57f880bfc25d/iptables-alerter/0.log"
Apr 24 19:54:02.550293 ip-10-0-138-6 kubenswrapper[2573]: I0424 19:54:02.550262 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6wv5x_7f546568-c9ad-4518-a7a9-893e659002a9/tuned/0.log"